=========================================
File 7jEVAWXP.TXT
-----------------------------------------
Evaluation of results for File, Macro and Script Virus/Malware
detection under Windows-XP in aVTC Test "2004-07":
=========================================
Formatted with non-proportional font (Courier)

*******************************************************************
Content of this file:
*******************************************************************
Eval WXP:    Development of detection rates under Windows-XP:
*******************************************************************
Eval WXP.01: Development of Windows-XP Scanner Detection Rates
             Table WXP-A: Comparison File/Macro/Script virus detection rates
Eval WXP.02: In-The-Wild Detection under WXP
Eval WXP.03: Evaluation of overall WXP AV detection rates
Eval WXP.04: Evaluation of detection by virus classes under WXP
             WXP.04.1 Grading the Detection of file viruses
             WXP.04.2 Grading the Detection of macro viruses
             WXP.04.3 Grading the Detection of script viruses
Eval WXP.05: Detection of Packed Viruses by virus classes under WXP
             AND Loss of Virus Detection through Packing
Eval WXP.06: Avoidance of False Alarms (Macro) under WXP
Eval WXP.07: Detection of Malware by virus class (file,macro,script)
Eval WXP.SUM Grading of WXP products
*******************************************************************

This part of the VTC "2004-07" test report evaluates the detailed
results as given in sections (files):
   6jWXP.TXT   File/Macro/Script Viruses/Malware results (WXP)

The following *25* products participated in this scanner test for
WXP products:

===============================================================
Products submitted for aVTC test under Windows-XP:
===============================================================
ANT = Antivir v:6.18.0.1              H+B EDV Datentechnik, Germany
AVA = Avast! v:0301-9                 ALWIL Software, Czech Republic
AVG = AVG Antivirus System v:6.0.456  GriSoft, Czech Republic
AVK = AntiVirenKit 10 v:12.0.3        GData Software, Germany
AVP = Kaspersky Anti-Virus (KAV) v:4.0.5.37  Kaspersky Lab., Russia
BDF = BitDefender Professional v:7.0 build 2473  SOFTWIN, Romania
CMD = Command Antivirus v:4.74.3      Command Software Systems, USA
DRW = Dr. Web v:4.29b                 DialogueScience, Russia
FIR = Fire Anti-virus Kit v:2.7       Prognet Technologies, India
FPR = F-PROT v:3.12d                  Frisk Software Intnl., Iceland
FSE = F-SECURE v:1.02.2410            F-Secure Corporation, Finland
GLA = Gladiator AV v:3.0.0            "Gladiator"
IKA = Ikarus Virus Utilities v:2.27   IKARUS Software, Austria
INO = eTrust AV v:6.0.102             Computer Associates Intnl., USA
NAV = Norton Antivirus v:8.00.9374    Symantec, USA
NVC = Norman Virus Control v:5.50     Norman Data Defense, Norway
PAV = Power AV v:11.0.5               GData Software, Germany
PER = Peruvian AntiVirus v:7.90       PER Systems, Peru
PRO = Protector v:7.2.D01             Proland Software, India
QHL = Quickheal v:6.08                Cat Computer Services, India
RAV = RAV Antivirus v8 v:8.3.1        GeCAD Software, Romania
SCN = McAfee VirusScan v:4.1.60       Network Associates, USA
SWP = Sophos AV v:3.66                Sophos, UK
VBR = VirusBuster v:2                 Leprechaun, Australia
VSP = VirScanPlus v:12.762            Ralph Roth, Germany
===============================================================

Eval WXP.01: Scanner Detection Rates under Windows-XP:
========================================================

The following table summarizes the results of file, macro and script
virus detection under Windows-XP (this is aVTC's second test under
WXP; for other Windows platforms, see the related reports):

Table WXP-A: File/Macro/Script Zoo Virus Detection Rates:
================================================================
Scan I === File Virus ===  + == Macro Virus ===  + === Script Virus ===
ner  I     Detection       I     Detection       I      Detection
-----+--------------------+--------------------+----------------------
Test I 0304  0407  Delta  I 0304  0407  Delta  I 0304  0407  Delta
-----+--------------------+--------------------+----------------------
ANT  I   -   90.9     -   I   -   97.9     -   I   -   87.5     -
AVA  I   -   95.7     -   I   -   97.9     -   I   -   88.8     -
AVG  I   -   79.8     -   I   -   98.0     -   I   -   68.2     -
AVK  I   -   100~     -   I   -   100~     -   I   -   99.8     -
AVP  I 100~  100~    0.0  I 100~  100~    0.0  I 98.9  99.7   +0.8
BDF  I 82.9  84.4   +1.5  I 99.0  98.1   -0.9  I 72.4  94.3  +22.1
CMD  I 98.5  98.6   +0.1  I 99.9  99.9    0.0  I 89.1  98.5   +9.4
DRW  I 98.3  77.8  -20.5  I 99.4  99.4    0.0  I 94.7  95.4   +0.7
FIR  I   -   75.4     -   I   -   85.9     -   I   -   75.9     -
FPR  I   -   99.5     -   I   -   99.9     -   I   -   99.3     -
FSE  I 100~  100~    0.0  I 100~  100~    0.0  I 99.5  100%   +0.5
GLA  I   -   40.7     -   I   -    1.5     -   I   -   49.5     -
IKA  I   -   90.5     -   I   -   76.5     -   I   -   91.7     -
INO  I 98.7  95.8   -2.9  I 99.9  99.9    0.0  I 94.7  97.5   +2.8
NAV  I 98.3  99.3   +1.0  I 99.6  99.9   +0.3  I 96.8  98.7   +1.9
NVC  I 97.8  95.0   -2.8  I 99.8  99.2   -0.6  I 87.6  86.3   -1.3
PAV  I   -   100~     -   I   -   100~     -   I   -   99.7     -
PER  I   -   35.9     -   I   -   69.8     -   I   -   22.9     -
PRO  I   -   67.2     -   I   -   73.1     -   I   -   70.4     -
QHL  I   -   56.0     -   I   -    -       -   I   -   29.1     -
RAV  I 96.7  99.4   +2.7  I 99.9  99.8   -0.1  I 96.1  99.7   +1.6
SCN  I 99.8  100~   +0.2  I 100%  100~   -0.0  I 99.6  100%   +0.4
SWP  I   -   98.2     -   I   -   99.7     -   I   -   96.8     -
VBR  I   -   68.5     -   I   -   98.4     -   I   -   46.4     -
VSP  I   -   14.6     -   I   -    0.1     -   I   -   83.5     -
-----+--------------------+--------------------+----------------------
Mean : 97.1  82.6%  -2.1% I 99.8  87.3%  -0.1  I 92.9  83.2%  +3.9%
>10% : 97.1  82.6%  -2.1% I 99.8  95.2%  -0.1  I 92.9  83.2%  +3.9%
-----+--------------------+--------------------+----------------------

Remark: for abbreviations of products (code names), see appendix
        A5CodNam.txt.
Concerning detection of zoo viruses in ALL three categories (file,
macro AND script), no product is "perfect" in detecting ALL viruses,
but 3 products are "excellent" as they detect more than 99% in all
categories:
   FSE, SCN (100~ 100~ 100%), AVP (100~ 100~ 99.7%)

Concerning FILE zoo virus detection, NO product is able to detect
ALL viruses (rating: "perfect"), but 3 products detect more than
99% and are rated "excellent":
   AVP, FSE, SCN (all: 100~)

Concerning macro zoo virus detection, NO product detects ALL viruses
and is rated "perfect" (last time: one product). In addition, 13 (of
25) products detect >=99% of viruses and are rated "excellent":
   AVK,AVP,FSE,PAV,SCN (all: 100~), CMD,FPR,INO,NAV (all: 99.9%),
   RAV (99.8%), SWP (99.7%), DRW (99.4%) and NVC (99.2%)

Concerning script zoo virus detection, 2 products detect ALL viruses
(last time: NO product) and are rated "perfect":
   FSE, SCN (100%)
5 products detect more than 99% and are rated "excellent":
   AVK (99.8%), AVP,PAV,RAV (99.7%), FPR (99.3%)

******************************************************************
Findings WXP.1:
   In comparison with the last test (where 10 good products were
   selected), the mean detection rate is significantly reduced
   (due to some products with very bad detection rates):
      mean file zoo virus detection rate:   82.6%
      mean macro virus detection rate:      87.3%
      mean script virus detection rate:     83.2%
   --------------------------------------------------
   Concerning detection of zoo viruses in all categories (file,
   macro and script), no product is "perfect" but 3 are
   "excellent" as they detect at least 99% of all zoo viruses:
      FSE,SCN,AVP
   --------------------------------------------------
   Concerning file zoo viruses:
      NO product detects ALL viruses ("perfect")
      3 products detect more than 99% and are rated "excellent":
         AVP,FSE,SCN
   --------------------------------------------------
   Concerning macro zoo viruses:
      NO product detects ALL macro zoo viruses in all files and
      is rated "perfect": ---
      13 products detect at least 99% of macro viruses and are
      rated "excellent": AVK,AVP,FSE,PAV,SCN,CMD,FPR,INO,NAV,
      RAV,SWP,DRW,NVC
   --------------------------------------------------
   Concerning script zoo viruses:
      2 products detect ALL viruses and are rated "perfect":
         FSE,SCN
      5 products detect at least 99% of script viruses and are
      rated "excellent": AVK,AVP,PAV,RAV,FPR
*****************************************************************

Eval WXP.02: In-The-Wild (File,Macro,Script) Detection under WXP
================================================================

Concerning "In-The-Wild" viruses, the following grid is applied:
   - detection rate is 100% : scanner is "perfect"
   - detection rate is >99% : scanner is "excellent"
   - detection rate is >95% : scanner is "very good"
   - detection rate is >90% : scanner is "good"
   - detection rate is <90% : scanner is "risky"

100% detection of In-the-Wild viruses, including ALL instantiations
(infected objects) of those viruses, is an ABSOLUTE REQUIREMENT for
file, macro and script viruses to be rated "perfect".

In comparison with the last test (2003-04), mean detection rates
have dropped significantly:
   Mean detection rate
      - of ITW file viruses:   91.5% (last test: 99.8%)
      - of ITW macro viruses:  91.5% (last test: 100%)
      - of ITW script viruses: 95.1% (last test: 100%)

Even allowing that the last test included the 10 best scanners
(selected for aVTC's first WXP test), whereas this test includes 25
scanners, some with "reduced" detection, mean detection rates of
ITW viruses should generally be in the order of 98-99%!

Presently, only 1 scanner is rated "perfect" in detecting ALL ITW
viruses in ALL 3 classes (file, macro and script):
   SCN (100% viruses, 100% infected objects).
In comparison with the last test, where 5 scanners achieved a
"perfect" rating, this is a significant loss of detection.
Remark: While several products achieve 100% macro and script ITW
virus detection, the loss of ITW file virus detection of all but
one product is essentially due to non-detection of ONE ITW virus
with ONE sample. Following aVTC's long-established policy, product
submission dates for this test were 6 weeks AFTER freezing the
testbeds, esp. including the ITW testbeds. Consequently,
non-detection of an ITW virus which had been known for at least 6
weeks before product delivery for testing is an indication of
inadequate management of virus signatures.

                    ITW virus/file detection
                 ( FileV.     MacroV.    ScriptV. )
----------------------------------
"Perfect" WXP ITW scanners:
   SCN (100% 100%; 100% 100%; 100% 100%)
----------------------------------
Next best ITW scanners:
   AVK (99.1 99.8; 100% 100%; 100% 100%)
   AVP (99.1 99.8; 100% 100%; 100% 100%)
   FSE (99.1 99.8; 100% 100%; 100% 100%)
   NAV (99.1 99.8; 100% 100%; 100% 100%)
   PAV (99.1 99.8; 100% 100%; 100% 100%)
----------------------------------

Concerning detection of ITW file viruses only:
   1 scanner is "perfect" (last test: 5):
      SCN (100% 100%)
   11 next best scanners (unrated) miss 1 virus/1 sample:
      AVK,AVP,DRW,FPR,FSE,NAV,PAV,RAV (99.1% 99.8%),
      INO,SWP (99.1% 99.5%), AVA (99.1% 99.1%)

Concerning detection of ITW macro viruses only:
   10 scanners are rated "perfect" (last test: 6):
      ANT,AVK,AVP,BDF,DRW,FSE,NAV,PAV,SCN,SWP: all (100% 100%)
   8 more scanners are rated "excellent":
      AVG,CMD,FPR,INO,RAV: all (100% 99.9%), AVA (100% 99.5%),
      IKA (100% 99.4%), PRO (100% 99.3%)

Concerning detection of ITW script viruses only:
   6 scanners are rated "perfect" (last test: 8):
      AVK,AVP,FSE,NAV,PAV,SCN: all (100% 100%)
   3 more scanners are rated "excellent":
      FPR,INO,RAV: all (100% 99.4%)

******************************************************************
Findings WXP.2:
   Mean detection rates of ITW file/macro/script viruses have
   significantly decreased in comparison with the last test:
      Mean detection rate of
      - ITW file viruses:   91.5% (last test: 99.8%)
      - ITW macro viruses:  91.5% (last test: 100%)
      - ITW script viruses: 95.1% (last test: 100%)
   *************************************************
   Only 1 AV product (out of 25) detects ALL In-The-Wild file,
   macro and script viruses in ALL instantiations (files) and is
   rated "perfect": SCN
   5 more scanners miss one ITW file virus but detect all ITW
   macro and script viruses: AVK,AVP,FSE,NAV,PAV
   *************************************************
   Concerning detection of ITW file viruses only:
      1 "perfect" scanner (last test: 5): SCN
      11 next best scanners (unrated):
         AVK,AVP,DRW,FPR,FSE,NAV,PAV,RAV,INO,SWP,AVA
   Concerning detection of ITW macro viruses only:
      10 "perfect" scanners (last test: 6):
         ANT,AVK,AVP,BDF,DRW,FSE,NAV,PAV,SCN,SWP
      8 "excellent" scanners:
         AVG,CMD,FPR,INO,RAV,AVA,IKA,PRO
   Concerning detection of ITW script viruses:
      6 "perfect" scanners (last test: 8):
         AVK,AVP,FSE,NAV,PAV,SCN
      3 more "excellent" scanners: FPR,INO,RAV
*****************************************************************

Eval WXP.03: Evaluation of overall W-XP AV detection rates (zoo,ITW)
====================================================================

The following grid is applied to classify scanners:
   - detection rate =100%     : scanner is graded "perfect"
   - detection rate above 99% : scanner is graded "excellent"
   - detection rate above 95% : scanner is graded "very good"
   - detection rate above 90% : scanner is graded "good"
   - detection rate of 80-90% : scanner is graded "good enough"
   - detection rate of 70-80% : scanner is graded "not good enough"
   - detection rate of 60-70% : scanner is graded "rather bad"
   - detection rate of 50-60% : scanner is graded "very bad"
   - detection rate below 50% : scanner is graded "useless"

To assess an "overall AV grade" (including file, macro and script
virus detection, for unpacked objects), the lowest of the related
results is used to classify each scanner. Only scanners for which
all tests were completed are considered. (For problems in the
test, see 8problms.txt.)
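The grading grid and the "lowest result decides" rule can be sketched as a simple threshold function. This is a minimal illustration in Python; the function names are ours, and the treatment of the grid's boundary values (e.g. whether exactly 90% counts as "good enough") is an assumption, as the report does not spell it out:

```python
def grade(rate):
    """Map a detection rate (in percent) to a VTC grade per the grid above.
    Boundary handling below exact thresholds is our assumption."""
    if rate == 100.0:
        return "perfect"
    if rate > 99.0:
        return "excellent"
    if rate > 95.0:
        return "very good"
    if rate > 90.0:
        return "good"
    if rate >= 80.0:
        return "good enough"
    if rate >= 70.0:
        return "not good enough"
    if rate >= 60.0:
        return "rather bad"
    if rate >= 50.0:
        return "very bad"
    return "useless"

def overall_grade(file_rate, macro_rate, script_rate):
    """The overall AV grade is determined by the weakest of the three
    zoo detection results (lowest rate decides)."""
    return grade(min(file_rate, macro_rate, script_rate))

# SCN's zoo rates from Table WXP-A (99.8% file, 100% macro, 99.6% script):
print(overall_grade(99.8, 100.0, 99.6))  # excellent
```

With this rule, a single weak class pulls the whole product down: a scanner with 100% macro detection but 72.4% script detection lands at "not good enough" overall.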
The following list indicates those scanners graded into one of the
upper three categories, with file, macro and script virus detection
rates for unpacked samples, and with perfect ITW virus detection
(rate=100%).

Remark: due to non-detection of ONE ITW virus, many scanners which
were previously rated "perfect" or "excellent" could not be rated,
as 100% ITW detection is a "conditio sine qua non" (aka: necessary
condition).

                       Zoo test:            ITW test:
                 (file/macro/script;   file/macro/script)
--------------------------------------
"Perfect" WXP scanners:
   ========= NONE =========
"Excellent" WXP scanners:
   SCN (99.8% 100% 99.6%;  100% 100% 100%)
--------------------------------------
"Very Good" WXP scanners:
   =========== NO PRODUCT ==========
--------------------------------------

******************************************************************
Findings WXP.3:
   Now, NO WXP product is overall rated "perfect"
   1 "excellent" overall scanner: SCN
   NO "very good" overall scanners: ---
******************************************************************

Eval WXP.04: Evaluation of detection by virus classes under Windows-XP:
=======================================================================

Some scanners are specialised in detecting some class of viruses
(either by deliberately limiting themselves to one class, esp.
macro viruses, or by detecting one class significantly better than
others). It is therefore worth noting which scanners perform best
in detecting macro and script viruses. Products rated "perfect"
(=100%), "excellent" (>99%) and "very good" (>95%) are listed.
WXP.04.1 Grading the Detection of file zoo viruses under WXP:
-------------------------------------------------------------
"Perfect" WXP file scanners:     === NONE ===
"Excellent" WXP file scanners:   AVK (100~)  AVP (100~)  FSE (100~)
                                 PAV (100~)  SCN (100~)  FPR (99.5%)
                                 RAV (99.4%) NAV (99.3%)
"Very Good" WXP file scanners:   CMD (98.6%) SWP (98.2%) INO (95.8%)
                                 AVA (95.7%) NVC (95.0%)

WXP.04.2 Grading the Detection of macro zoo viruses under WXP:
--------------------------------------------------------------
"Perfect" WXP macro scanners:    === NONE ===
"Excellent" WXP macro scanners:  AVK (100~)  AVP (100~)  FSE (100~)
                                 PAV (100~)  SCN (100~)  CMD (99.9%)
                                 FPR (99.9%) INO (99.9%) NAV (99.9%)
                                 RAV (99.8%) SWP (99.7%) DRW (99.4%)
                                 NVC (99.2%)
"Very Good" WXP macro scanners:  VBR (98.4%) BDF (98.1%) AVG (98.0%)
                                 ANT (97.9%) AVA (97.9%)

WXP.04.3 Grading the Detection of Script zoo viruses under WXP:
---------------------------------------------------------------
"Perfect" WXP script scanners:   FSE (100%)  SCN (100%)
"Excellent" WXP script scanners: AVK (99.8%) AVP (99.7%) PAV (99.7%)
                                 RAV (99.7%) FPR (99.3%)
"Very Good" WXP script scanners: NAV (98.7%) CMD (98.5%) INO (97.5%)
                                 SWP (96.8%) DRW (95.4%)

***********************************************************************
Finding WXP.4: Performance of WXP scanners by virus classes:
   0 Perfect scanners for file zoo: ---
   8 Excellent scanners for file zoo (last time:5):
        AVK,AVP,FSE,PAV,SCN,FPR,RAV,NAV
   5 Very Good scanners for file zoo: CMD,SWP,INO,AVA,NVC
   0 Perfect scanners for macro zoo (last time:1): ---
   13 Excellent scanners for macro zoo (last time:5):
        AVK,AVP,FSE,PAV,SCN,CMD,INO,FPR,NAV,RAV,SWP,DRW,NVC
   5 Very Good scanners for macro zoo (last time:0):
        VBR,BDF,AVG,ANT,AVA
   2 Perfect scanners for script zoo (last time:0): FSE,SCN
   5 Excellent scanners for script zoo (last time:2):
        AVK,AVP,PAV,RAV,FPR
   5 Very Good scanners for script zoo (last time:3):
        NAV,CMD,INO,DRW,SWP
***********************************************************************
Eval WXP.05: Detection of Packed File and Macro Viruses under Windows-XP
========================================================================

Detection of file and macro viruses within packed objects is
becoming essential for on-access scanning, esp. for incoming email
possibly loaded with malicious objects. It therefore seems
reasonable to test whether at least ITW viral objects compressed
with 6 popular methods (PKZIP, ARJ, LHA, RAR 1.5, WinRAR 3.0 and
CAB) are also detected. Tests are performed only on In-The-Wild
viruses packed once (no recursive packing). As the last test showed
that AV products are rather far from perfect detection of packed
viruses, the testbed has been left essentially unchanged to ease
comparison and show improvement.

Following the analysis of detection of ITW file AND macro viruses,
where 100% ITW detection is mandatory, the following grading is
applied:
   "perfect":   100% detection of ITW file AND macro viruses
                packed with 6 packers
   "excellent": 100% detection of ITW file AND macro viruses
                packed with 5 packers
   "very good": 100% detection of ITW file AND macro viruses
                packed with 4 packers
   "good":      100% detection of ITW file AND macro viruses
                packed with 3 packers

**********************************************************
No product detected ITW samples packed with ALL 6 packers.
**********************************************************

As only ONE product fulfils the condition "100% ITW detection" for
both file and macro viruses (whereas the others MISS at least one
virus in the file ITW testbed), and this only for 5 packers, only
one product can be rated at all:

Concerning detection of BOTH file and macro virus samples, an
"excellent" product detects ALL viral samples (100%) with at
least 5 packers:
-----------------------------------------------------
"Excellent" packed virus detectors: SCN
-----------------------------------------------------

Concerning detection of packed file viruses only, all products
except one miss at least one sample of an ITW file virus:
   "Perfect" packed file virus detectors:   ---
   "Excellent" packed file virus detectors: SCN
   "Very Good" packed file virus detectors: ---
   "Good" packed file virus detectors:      ---
Comment: 16 products missed just ONE ITW virus, but VTC's strong
requirement for grading is that ALL ITW viruses MUST be detected!

In comparison, several products are significantly more successful
in detecting ALL packed ITW macro viruses:
   "Perfect" packed macro virus detectors:   AVK,AVP,BDF,FSE,PAV
   "Excellent" packed macro virus detectors: AVA,DRW,QHL,RAV,SCN
   "Very Good" packed macro virus detectors: FPR,INO

One NEW table analyses whether all ITW viruses which a product
detects in UNPACKED form are also detected when PACKED with one of
the 6 packers. This analysis is applied to ALL products, including
those which do NOT fulfil the "100% ITW detection" criterion. The
related tables show that some scanners detect ALL ITW viruses
RELIABLY both in unpacked and packed forms, but some scanners show
significant LOSSES of detection for file and macro viruses (tables
WXP.F3L and WXP.M3L).
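The packer-count grading defined above can be sketched as follows. This is a hypothetical illustration: the dictionary layout and function name are ours, and it assumes the product already fulfils the 100% ITW precondition so that only the per-packer rates decide the grade:

```python
# The six packing methods used in the VTC packed-virus test.
PACKERS = ("PKZIP", "ARJ", "LHA", "RAR 1.5", "WinRAR 3.0", "CAB")

# Grade by the number of packers for which ALL ITW samples are detected,
# per the grid in Eval WXP.05 (6 -> perfect ... 3 -> good).
GRID = {6: "perfect", 5: "excellent", 4: "very good", 3: "good"}

def packed_grade(rates_by_packer):
    """rates_by_packer: packer name -> ITW detection rate in percent.
    Counts packers with full (100%) detection and maps to a grade."""
    full = sum(1 for p in PACKERS if rates_by_packer.get(p, 0.0) == 100.0)
    return GRID.get(full, "unrated")  # fewer than 3 full packers: no grade

# Hypothetical product with full detection for five of the six packers:
rates = {p: 100.0 for p in PACKERS}
rates["CAB"] = 97.0
print(packed_grade(rates))  # excellent
```

Note how strict the grid is: a product missing a single sample under a single packer drops a whole grade, mirroring the report's all-or-nothing per-packer requirement.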
"Perfect scanners" (6 packers):
   No LOSS in detection of file AND macro ITW viruses: AVP,FSE
   No LOSS in detection of ITW file viruses:  AVK,AVP,FSE
   No LOSS in detection of ITW macro viruses: AVP,FSE
"Excellent scanners" (5 packers):
   No LOSS in detection of file AND macro ITW viruses: PAV,RAV,SCN
   No LOSS in detection of ITW file viruses:  PAV,RAV,SCN
   No LOSS in detection of ITW macro viruses: AVA,BDF,DRW,PAV,QHL,RAV,SCN
"Very Good scanners" (4 packers):
   No LOSS in detection of file AND macro ITW viruses: SWP
   No LOSS in detection of ITW file viruses:  BDF,DRW,SWP
   No LOSS in detection of ITW macro viruses: INO,NAV,SWP
"Good scanners" (3 packers):
   No LOSS in detection of file AND macro ITW viruses: CMD,FPR
   No LOSS in detection of ITW file viruses:  CMD,FPR,NAV
   No LOSS in detection of ITW macro viruses: AVK,CMD,FPR,PRO

***********************************************************************
Findings WXP.5:
   Concerning OVERALL detection of packed file AND macro viruses,
      NO product is "perfect": ---
      1 product is "excellent": SCN
      No product is "very good" or "good": ---
   *******************************************************
   Concerning detection of packed FILE viruses:
      NO product is "perfect": ---
      1 product is "excellent": SCN
   *******************************************************
   Concerning detection of packed MACRO viruses:
      5 products are "perfect":   AVK,AVP,BDF,FSE,PAV
      5 products are "excellent": AVA,DRW,QHL,RAV,SCN
      2 products are "very good": FPR,INO
   *******************************************************
   Concerning EQUAL detection of UNPACKED AND PACKED ITW file AND
   macro viruses:
      2 "perfect" products have NO LOSS for 6 packers:   AVP,FSE
      3 "excellent" products have NO LOSS for 5 packers: PAV,RAV,SCN
      1 "very good" product has NO LOSS for 4 packers:   SWP
      2 "good" products have NO LOSS for 3 packers:      CMD,FPR
***********************************************************************

Eval WXP.06: Avoidance of False Alarms (File, Macro) under Windows-XP:
======================================================================

First introduced in VTC test "1998-10", a set of clean (and
non-malicious) objects has been added to the file and macro virus
testbeds to determine the ability of scanners to avoid
False-Positive (FP) alarms. This ability is essential for
"excellent" and "very good" scanners, as there is no automatic aid
helping customers to handle such cases (besides the psychological
impact on the customer's work). Therefore, the grid used for
grading AV products must be significantly more rigid than the one
used for detection.

The following grid is applied to classify scanners:
   - False Positive rate = 0.0%: scanner is graded "perfect"
   - False Positive rate < 0.5%: scanner is graded "excellent"
   - False Positive rate < 2.5%: scanner is graded "very good"
   - False Positive rate < 5.0%: scanner is graded "good enough"
   - False Positive rate <10.0%: scanner is graded "rather bad"
   - False Positive rate <20.0%: scanner is graded "very bad"
   - False Positive rate >20.0%: scanner is graded "useless"

It is good to observe that ALL 25 scanners avoid FP alerts on
clean files; but concerning clean macro objects, only 11 (out of
25) products are "perfect" in avoiding any alarm, and 7 more
products are "excellent" as they alert on at most 2 samples
(<0.6%).

Remark: the testbed included 2 CLEAN (non-viral, non-malicious)
macro objects which were taken from a goat generator. While there
is no reason for alerting on such samples, SOME AV experts argue
that these samples must be detected as they may be used by VX
groups. Indeed, some scanners diagnosed those samples as an
"infection", which is misleading; this is counted as a "false
positive diagnosis" (a warning would be acceptable).
Overall, the following products did not issue any false alarm:
---------------------------------------------------------------------
"Perfect" file-FP AND macro-FP avoiding WXP products:
   ANT,AVA,AVG,BDF,GLA,INO,NAV,PRO,SCN,SWP,VSP
"Excellent" file-FP AND macro-FP avoiding WXP products:
   AVK,AVP,CMD,FPR,FSE,PAV,RAV
---------------------------------------------------------------------
"Perfect" file-FP avoiding WXP products:
   ANT,AVA,AVG,AVK,AVP,BDF,CMD,DRW,FIR,FPR,FSE,GLA,IKA,INO,NAV,
   NVC,PAV,PER,PRO,QHL,RAV,SCN,SWP,VBR,VSP
"Excellent" file-FP avoiding WXP products: ---
---------------------------------------------------------------------
"Perfect" macro-FP avoiding WXP products:
   ANT,AVA,AVG,BDF,GLA,INO,NAV,PRO,SCN,SWP,VSP
"Excellent" macro-FP avoiding WXP products:
   AVK,AVP,CMD,FPR,FSE,PAV,RAV
---------------------------------------------------------------------

********************************************************************
Findings WXP.06:
   Avoidance of False-Positive alarms is rather well developed,
   at least for file-FP avoidance.
   11 overall FP-avoiding "perfect" WXP scanners:
      ANT,AVA,AVG,BDF,GLA,INO,NAV,PRO,SCN,SWP,VSP
   7 more products are "excellent":
      AVK,AVP,CMD,FPR,FSE,PAV,RAV
   ***************************************************
   Concerning file-FP avoidance, ALL 25 products are "perfect":
      ANT,AVA,AVG,AVK,AVP,BDF,CMD,DRW,FIR,FPR,FSE,GLA,IKA,INO,
      NAV,NVC,PAV,PER,PRO,QHL,RAV,SCN,SWP,VBR,VSP
   ***************************************************
   Concerning macro-FP avoidance,
   11 products are "perfect":
      ANT,AVA,AVG,BDF,GLA,INO,NAV,PRO,SCN,SWP,VSP
   7 products are "excellent":
      AVK,AVP,CMD,FPR,FSE,PAV,RAV
********************************************************************

Eval WXP.07: Detection of File, Macro and Script Malware under Windows-XP
=========================================================================

Since test "1997-07", VTC also tests the ability of AV products to
detect non-viral malware.
An essential argument for this category is that customers are
interested in also being warned about, and protected from,
non-viral and non-wormy malicious objects such as trojans etc.,
the payload of which may be disastrous to their work (e.g.
stealing passwords). Since VTC test "1999-03", malware detection
is a mandatory part of VTC tests, both for submitted products and
for those downloaded as free evaluation copies. A growing number
of scanners is indeed able to detect non-viral malware.

The following grid (admittedly with reduced granularity) is
applied to classify detection of file and macro malware:
   - detection rate =100%     : scanner is "perfect"
   - detection rate > 90%     : scanner is "excellent"
   - detection rate of 80-90% : scanner is "very good"
   - detection rate of 60-80% : scanner is "good enough"
   - detection rate of < 60%  : scanner is "not good enough"

Generally, detection of malware needs significant further
development, as mean detection rates over platforms have further
deteriorated (except for script malware) since the last test
(2003-04):
   Mean detection rate
      for file malware:   68.2% (73.6% for scanners >10%)
      for macro malware:  84.2% (91.4% for scanners >10%)
      for script malware: 62.0% (64.4% for scanners >10%)

In comparison with the last test (2003-04), mean detection rates
changed as follows:
   for file malware:   from 81.3% to 68.2%: MUCH REDUCED
   for macro malware:  from 96.6% to 84.2%: MUCH REDUCED
   for script malware: from 67.8% to 62.0%: REDUCED

Comment: the reduction in mean macro malware detection is due to
extremely low detection by some products, whereas the loss in
file malware detection is caused by the strong increase of W32
malware strains such as Spybot, Randex et al., which are evidently
not available to all manufacturers.
Concerning File, Macro AND Script malware detection:
------------------------------------------------------------
"Perfect" file/macro/script malware detectors under WXP: ---
------------------------------------------------------------
"Excellent" file/macro/script malware detectors under WXP:
   FSE ( 99.8%  100%   98.5% )
   AVK ( 99.7%  100%   98.5% )
   AVP ( 99.4%  100%   98.2% )
   PAV ( 99.3%  100%   97.9% )
   SCN ( 97.7%  99.8%  98.2% )
   RAV ( 91.3%  99.2%  93.3% )
------------------------------------------------------------
"Very Good" file/macro/script malware detectors under WXP:
   NAV ( 92.8%  98.4%  89.7% )
   FPR ( 98.9%  100%   89.1% )
   AVA ( 80.3%  95.9%  86.7% )
------------------------------------------------------------

Concerning file malware detection only:
------------------------------------------------------------
"Perfect" file malware detectors under WXP: ---
"Excellent" file malware detectors under WXP:
   FSE(99.8%), AVK(99.7%), AVP(99.4%), PAV(99.3%), FPR(98.9%),
   SCN(97.7%), NAV(92.8%), RAV(91.3%)
"Very Good" file malware detectors under WXP:
   CMD,SWP(89.0%), INO(83.5%)
------------------------------------------------------------

Concerning macro malware detection only (relatively best
developed):
------------------------------------------------------------
"Perfect" macro malware detectors under WXP:
   AVK,AVP,CMD,FPR,FSE,PAV(100%)
"Excellent" macro malware detectors under WXP:
   SCN(99.8%),INO(99.4%),RAV(99.2%),DRW(99.0%),
   NAV,SWP(98.4%),NVC(97.1%),BDF(96.3%),AVA(95.9%),
   VBR(94.8%),IKA(91.5%)
"Very Good" macro malware detectors under WXP:
   ANT(86.4%)
------------------------------------------------------------

And concerning script malware detection only:
------------------------------------------------------------
"Perfect" script malware detectors under WXP: ---
"Excellent" script malware detectors under WXP:
   AVK,FSE(98.5%), AVP,SCN(98.2%), PAV(97.9%), RAV(93.3%)
"Very Good" script malware detectors under WXP:
   NAV(89.7%), FPR(89.1%), AVA(86.7%)
------------------------------------------------------------

*******************************************************************
Findings WXP.7:
   Mean file/macro/script malware detection rates have further
   deteriorated since the last test, to an unacceptably low
   level. Mean detection rates:
      for file malware:   68.2% (73.6% for scanners >10%)
      for macro malware:  84.2% (91.4% for scanners >10%)
      for script malware: 62.0% (64.4% for scanners >10%)
   ***************************************************
   Concerning ALL products, file/macro/script malware detection
   under WXP needs significant improvement:
      0 products are "perfect": ---
      6 products are "excellent": FSE,AVK,AVP,PAV,SCN,RAV
      3 products are "very good": NAV,FPR,AVA
   ***************************************************
   Concerning only file malware detection,
      0 products are "perfect": ---
      8 products are "excellent": FSE,AVK,AVP,PAV,FPR,SCN,NAV,RAV
      3 products are rated "very good": CMD,SWP,INO
   ***************************************************
   Concerning only macro malware detection,
      6 products are "perfect": AVK,AVP,CMD,FPR,FSE,PAV
      11 products are "excellent":
         SCN,INO,RAV,DRW,NAV,SWP,NVC,BDF,AVA,VBR,IKA
      1 product is rated "very good": ANT
   ***************************************************
   Concerning only script malware detection,
      0 products are "perfect": ---
      6 products are "excellent": AVK,FSE,AVP,SCN,PAV,RAV
      3 products are rated "very good": NAV,FPR,AVA
*******************************************************************

Eval WXP.SUM: Grading of Windows-XP products:
=============================================

Under the scope of VTC's grading system, a "Perfect WXP AV/AM
product" would have the following characteristics:

Definition (1): A "Perfect AntiVirus (AV) product"
--------------------------------------------------
1) Will detect ALL viral samples "In-The-Wild" AND at least 99%
   of zoo samples, in ALL categories (file, macro and
   script-based viruses), always with the same high precision of
   identification
   and in every infected sample,
2) Will detect ALL ITW viral samples in compressed objects for
   all 6 popular packers, and
2A) Will detect ALL ITW samples both in unpacked instantiations
   AND packed with ALL (6) popular packers, and
3) Will NEVER issue a False-Positive alarm on any sample which is
   not viral.

Definition (2): A "Perfect AntiMalware (AM) product"
----------------------------------------------------
1) Will be a "Perfect AntiVirus product", that is:
      100% ITW detection AND >99% zoo detection,
      AND high precision of identification,
      AND high precision of detection,
      AND 100% detection of ITW viruses in compressed objects,
      AND 0% False-Positive rate,
2) AND it will also detect essential forms of malicious software,
   at least in unpacked forms, reliably at high rates (>90%).

*****************************************************************
In VTC test "2004-07", we found
        *** NO perfect WXP AV product ***
   and we found
        *** NO perfect WXP AM product ***
*****************************************************************

But several products seem to approach our definition on a rather
high level (taking into account the highest value of "perfect"
defined on the 100% level, and "excellent" defined by 99% for
virus detection and 90% for malware detection):

Test category:           "Perfect"             "Excellent"
------------------------------------------------------------------
WXP file ITW test:       SCN                   AVK,AVP,DRW,FPR,FSE,NAV,
                                               PAV,RAV,INO,SWP,AVA
WXP macro ITW test:      ANT,AVK,AVP,BDF,DRW,  AVG,CMD,FPR,INO,RAV,
                         FSE,NAV,PAV,SCN,SWP   AVA,IKA,PRO
WXP script ITW test:     AVK,AVP,FSE,NAV,      FPR,INO,RAV
                         PAV,SCN
------------------------------------------------------------------
WXP file zoo test:       ---                   AVK,AVP,FSE,PAV,SCN,
                                               FPR,RAV,NAV
WXP macro zoo test:      ---                   AVK,AVP,FSE,PAV,SCN,CMD,
                                               INO,FPR,NAV,RAV,SWP,DRW,NVC
WXP script zoo test:     FSE,SCN               AVK,AVP,PAV,RAV,FPR
------------------------------------------------------------------
WXP file pack test:      ---                   SCN
WXP macro pack test:     AVK,AVP,BDF,FSE,PAV   AVA,DRW,QHL,RAV,SCN
+ WXP
  pack/unpack test:      AVP,FSE               PAV,RAV,SCN
------------------------------------------------------------------
WXP file FP avoidance:   ANT,AVA,AVG,AVK,AVP,  ---
                         BDF,CMD,DRW,FIR,FPR,
                         FSE,GLA,IKA,INO,NAV,
                         NVC,PAV,PER,PRO,QHL,
                         RAV,SCN,SWP,VBR,VSP
WXP macro FP avoidance:  ANT,AVA,AVG,BDF,GLA,  AVK,AVP,CMD,FPR,
                         INO,NAV,PRO,SCN,SWP,  FSE,PAV,RAV
                         VSP
------------------------------------------------------------------
WXP file malware test:   ---                   FSE,PAV,AVK,FPR,
                                               AVP,SCN,NAV,RAV
WXP macro malware test:  AVK,AVP,CMD,FSE,PAV   SCN,INO,RAV,DRW,NAV,
                                               SWP,NVC,BDF,AVA,IKA,VBR
WXP script malware test: ---                   AVK,FSE,AVP,SCN,PAV,RAV
------------------------------------------------------------------

In order to support the race for more customer protection, we
evaluate the order of performance in this WXP test with a simple
algorithm: counting the majority of places (weighting "perfect"
twice and "excellent" once) for the first places:

************************************************************
"Perfect" Windows-XP AntiVirus product:  =NONE= (22 points)
"Excellent" Windows-XP products:
    1st place: SCN          (17 points)
    2nd place: FSE          (16 points)
    3rd place: AVP          (15 points)
    4th place: AVK,PAV      (13 points)
    6th place: NAV,RAV      (11 points)
    8th place: FPR          ( 9 points)
    9th place: BDF,INO,SWP  ( 8 points)
   12th place: AVA,DRW      ( 7 points)
   14th place: ANT          ( 6 points)
   15th place: AVG,CMD,PRO  ( 5 points)
   18th place: GLA,VSP      ( 4 points)
   20th place: IKA,NVC,QHL  ( 3 points)
   22nd place: FIR,PER,VBR  ( 2 points)
************************************************************
"Perfect" Windows-XP AntiMalware product: =NONE= (28 points)
"Excellent" Windows-XP AntiMalware products:
    1st place: SCN,FSE      (20 points)
    3rd place: AVP          (19 points)
    4th place: AVK,PAV      (17 points)
    6th place: RAV          (14 points)
    7th place: NAV          (13 points)
    8th place: FPR          (10 points)
    9th place: BDF,INO,SWP  ( 9 points)
   12th place: AVA,DRW      ( 8 points)
   14th place: CMD          ( 7 points)
   15th place: IKA,NVC      ( 4 points)
   17th place: VBR          ( 3 points)
************************************************************
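The place-counting algorithm described above (two points per "perfect" rating, one per "excellent", summed over all test categories) can be sketched as follows. This is a minimal illustration; the function name is ours and the data subset is a hypothetical two-category example, not the full grading table:

```python
def rank(perfect_lists, excellent_lists):
    """Rank products by points: each "perfect" rating in a test
    category is worth 2 points, each "excellent" rating 1 point.

    perfect_lists / excellent_lists: one list of product codes per
    test category. Returns (product, points) pairs, highest first."""
    points = {}
    for products in perfect_lists:
        for p in products:
            points[p] = points.get(p, 0) + 2
    for products in excellent_lists:
        for p in products:
            points[p] = points.get(p, 0) + 1
    return sorted(points.items(), key=lambda kv: -kv[1])

# Toy example with two categories only (file ITW and script zoo):
perfect   = [["SCN"], ["FSE", "SCN"]]
excellent = [["AVK", "AVP"], ["AVK", "AVP", "PAV"]]
print(rank(perfect, excellent))
```

On this toy input, SCN leads with 4 points (two "perfect" ratings), ahead of FSE, AVK and AVP with 2 points each; ties share a place, as in the report's numbering where a shared place skips the following ranks.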