==================================== File: 7EVAL-WNT.TXT ------------------------------------ Evaluation of results for Macro and Script Virus/Malware detection under Windows-NT in VTC Test "2001-10": ==================================== Formatted with non-proportional font (Courier) Content of this file: ********************************************************************** Eval WNT: Development of detection rates under Windows-NT: ********************************************************************** Eval WNT.01: Development of W-NT Scanner Detection Rates Table WNT-A2: Macro/Script Virus Detection Rate in last 7 VTC tests Eval WNT.02: In-The-Wild (Macro,Script) Detection under W-NT Eval WNT.03: Evaluation of overall W-NT AV detection rates (zoo,ITW) Eval WNT.04: Evaluation of detection by virus classes under W-NT WNT.04.2 Grading the Detection of macro viruses WNT.04.3 Grading the Detection of script viruses Eval WNT.05: Detection of Packed Macro Viruses under W-NT Eval WNT.06: Avoidance of False Alarms (Macro) under W-NT Eval WNT.07: Detection of Macro and Script Malware under W-NT Eval WNT.SUM Grading of WNT products ********************************************************************** This part of VTC "2001-10" test report evaluates the detailed results as given in section (file): 6GWNT.TXT Macro and Script Viruses/Malware results (W-NT) The following *22* products participated in this scanner test for WNT products: -------------------------------------------------------- Products submitted for aVTC test under Windows-NT: -------------------------------------------------------- ANT v(def): 6.8.0.2 sig: June 05,2001 AVA v(def): unknown sig: unknown AVG v(def): 6.0.263 sig: June 22,2001 AVK v(def): 10.0.167 sig: June 21,2001 AVP v(def): 3.5.133.0 sig: June 01,2001 AVX v(def): 6.1 sig: June 18,2001 CMD v(def): 4.61.5 sig: June 25,2001 DRW v(def): 4.25 sig: June 20,2001 FPR v(def): 3.09d sig: June 25,2001 FPW v(def): 3.09d sig: June 25,2001 FSE v(def): 1.00.1251 sig: June 20,2001 scan eng fprot: 3.09.507 scan eng avp: 3.55.3210 scan eng orion: 1.02.15 IKA v(def): 5.01 sig: June 25,2001 INO v(def): 6.0.85 sig: June 14,2001 MR2 v(def): 1.17 sig: June 25,2001 NAV v(def): 4.1.0.6 sig: June 22,2001 NVC v(def): 5.00.25 sig: June 19,2001 PAV v(def): 3.5.133.0 sig: June 23,2001 QHL v(def): 6.02 sig: June 28,2001 RAD v(def): 8.1.001 sig: June 25,2001 RAV v(def): 8.2.001, scan eng:8.3 sig: June 25,2001 SCN v(def): 4144 scan eng:4.1.40 sig: June 20,2001 VSP v(def): 12.22.1 sig: June 25,2001 -------------------------------------------------------- Eval WNT.01: Development of Scanner Detection Rates under Windows-NT: ==================================================================== The number of scanners running under Windows NT is growing. Evidently, AV producers invest now more work into the development of the W32-related platforms, and here into the detection of macro (and script) viruses whereas detection of file viruses degrades. 
The following table summarizes the results of macro and script virus
detection under Windows-NT in the last 9 VTC tests:

Table WNT-A.2: Macro&Script Virus Detection Rate in last 9 VTC tests under W-NT:
================================================================================
Scan  ============= Macro Virus Detection ==============  + = Script Virus Detection =
ner   9707 9802 9810 9903 9909 0004 0008 0104 0110 Delta I 0008 0104 0110 Delta
---------------------------------------------------------+--------------------------
ANT   92.2   -  85.7   -  89.3 90.2 96.4   -  97.1    -  I 55.2   -  81.8    -
ANY     -    -  70.5   -    -    -    -    -    -     -  I   -    -    -     -
ATD     -    -    -    -    -  99.9   -    -    -     -  I   -    -    -     -
AVA     -  91.9 97.2 95.2 93.3 94.3 94.1 95.7 97.7  +2.0 I 15.0 29.1 29.6  +0.5
AVG     -    -    -  82.5 96.6 97.5 97.9 98.3 98.4  +0.1 I 45.8 57.9 62.9  +5.0
AVK     -    -  99.6 99.6 100% 99.9 100~ 100~ 100%   0.0 I 91.8 99.8 100%  +0.2
AVP     -    -  100% 99.2 100% 99.9 100~ 100~ 100~   0.0 I 88.2 99.8 100%  +0.2
AVX     -    -    -    -    -    -  99.0   -  99.1    -  I 61.4   -  70.1    -
AW      -  61.0   -    -    -    -    -    -    -     -  I   -    -    -     -
CLE     -    -    -    -    -    -    -    -    -     -  I  4.2   -    -     -
CMD     -    -    -    -    -  100% 100% 100% 100~   0.0 I 93.5 96.9 93.9  -3.0
DRW/DWW -    -    -  98.3 98.8 98.4 97.5   -  99.5    -  I 59.8   -  95.4    -
DSS/E 99.0 100% 100% 100%   -    -    -    -    -     -  I   -    -    -     -
ESA     -    -    -    -    -  88.9   -    -    -     -  I   -    -    -     -
FPR/FMA -  99.9 99.8 99.8 99.7   -    -  100% 100~   0.0 I   -  97.1 94.9  -2.2
FPW     -    -    -    -  99.7 100% 100% 100% 100~   0.0 I 90.8 96.9 94.6  -2.3
FSE     -    -  99.9 100% 100% 100% 100% 100% 100%   0.0 I 96.7 100% 100%   0.0
FWN     -    -  99.6 99.7   -  99.9   -    -    -     -  I   -    -    -     -
HMV     -    -  99.0 99.5   -    -    -    -    -     -  I   -    -    -     -
IBM   92.9 92.6 98.6   *    *    *    *    *    *     *  I   *    *    *     *
IKA     -    -    -    -    -    -    -    -  95.4    -  I   -    -  77.7    -
INO     -  89.7   -  99.8 99.7 99.7 99.8 99.7 99.9  +0.2 I 78.1 92.7 95.1  +2.4
IRS     -  99.1   -  99.5   -    -    -    -    -     -  I   -    -    -     -
IVB     -    -  92.8 95.0   -    -    -    -    -     -  I   -    -    -     -
MKS     -    -    -    -    -  97.1   -    -    -     -  I   -    -    -     -
MR2     -    -    -    -  69.6   -    -    -   0.7    -  I   -    -  83.3    -
NAV   95.6 98.7 99.9 99.7 98.7 98.0 97.7 97.0 99.5  +2.5 I 36.6 54.5 94.2 +39.7
NOD     -    -    -  99.8 100% 99.4   -    -    -     -  I   -    -    -     -
NVC   96.6 99.2   -  98.9 98.9 99.9 99.9 99.8 99.8   0.0 I 83.7 88.5 91.3  +2.8
NVN     -    -    -    -  99.5   -    -    -    -     -  I   -    -    -     -
PAV   93.5 98.8 99.5 99.4 99.7 99.9 100~ 100~ 100%   0.0 I 90.2 99.8 100%  +0.2
PCC     -  94.8   -    -    -    -    -    -    -     -  I   -    -    -     -
PER     -  91.0   -    -    -    -  85.0 68.2   -     -  I  0.0 22.0   -     -
PRO     -    -    -  58.0 61.9 67.4 69.1 67.1   -     -  I 13.1 35.8   -     -
QHL     -    -    -    -    -    -   0.0   -   0.0    -  I  6.9   -   0.2    -
RAD     -    -    -    -    -    -    -    -  99.5    -  I   -    -  82.5    -
RAV     -  98.9 99.5 99.2   -  97.9 96.9 99.6 99.5  -0.1 I 47.1 84.9  0.2    -
RA7     -    -    -  99.2   -    -    -    -    -     -  I   -    -    -     -
SCN   97.6 99.1 97.7 100% 100% 100% 100% 100% 100%   0.0 I 95.8 100% 99.8  -0.2
SWP   89.1 98.4 97.5   -  98.4 98.6   -    -    -     -  I   -    -    -     -
TBA   96.1   -  98.7   *    *    *    *    *    *     *  I   *    *    *     *
TNT     -    -  44.4   *    *    *    *    *    *     *  I   *    *    *     *
VET     -  94.0   -  94.9   *    *    *    *    *     *  I   *    *    *     *
VSA     -  84.4   -    -    -    -    -    -    -     -  I   -    -    -     -
VSP     -    -    -  86.7  0.3  0.0   -   0.0  0.0   0.0 I   -  85.3 84.0  -1.3
---------------------------------------------------------+--------------------------
Mean: 94.7 95.9 91.6 95.3 95.1 96.5 96.3 95.3 85.8 +0.3% I 57.7 78.9 78.7 +3.3%
Without extreme results:                    (99.2)       I           (86.6) (+0.3)
---------------------------------------------------------+--------------------------
Remark: for abbreviations of products (code names), see appendix
A5CodNam.txt.

Concerning macro viruses, the "mean" detection rate is significantly
reduced (by almost 10%), essentially due to some (new) products with
extremely low detection rates (<1%); when these (3) products are left
out, the mean detection rate is "excellent" (99.2%). A minimal sketch
of this computation is given below.
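The following sketch (Python; illustrative only, not part of the
original VTC tooling) recomputes both means from the 0110 macro zoo
column of Table WNT-A.2; "100~" entries are approximated as 100.0, so
the printed means may differ from the table in the last digit:

   -------------------------------------------------------------------
   # 0110 macro zoo detection rates taken from Table WNT-A.2
   # ("100~" approximated as 100.0); products without a 0110
   # result are omitted.
   rates = {
       "ANT": 97.1, "AVA": 97.7, "AVG": 98.4, "AVK": 100.0,
       "AVP": 100.0, "AVX": 99.1, "CMD": 100.0, "DRW": 99.5,
       "FPR": 100.0, "FPW": 100.0, "FSE": 100.0, "IKA": 95.4,
       "INO": 99.9, "MR2": 0.7,  "NAV": 99.5, "NVC": 99.8,
       "PAV": 100.0, "QHL": 0.0, "RAD": 99.5, "RAV": 99.5,
       "SCN": 100.0, "VSP": 0.0,
   }

   def mean(values):
       values = list(values)
       return sum(values) / len(values)

   # Mean over all 22 products, and mean without the "extreme"
   # results (detection rates below 1%):
   regular = [r for r in rates.values() if r >= 1.0]
   print(f"mean over all products:       {mean(rates.values()):5.1f}%")
   print(f"mean without extreme results: {mean(regular):5.1f}%")
   -------------------------------------------------------------------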
Those (15) products which also participated in the last test succeeded
in improving their detection rates slightly (+0.3%).

Concerning script viruses, presently the fastest-growing sector, the
mean detection rate remains essentially stable at a rather low level
(78.7%). Again, this result is influenced by some "newcomers" with
extremely low detection rates, whose insufficient detection cannot be
balanced even by the surprisingly strong improvement of NAV (+39.7%,
to now 94.2%!). Without counting those extremely bad detectors, the
mean detection rate would be 86.6%. Those (14) scanners which also
participated in the last test again succeeded in improving their
detection rate (+3.3% on average), but if one does not count the
unusually large improvement of NAV, the mean improvement is "normal"
(+0.3%).

***************************************************************************
Findings WNT.1: For W-NT, macro zoo virus detection is further improving.
                Out of 22 products:
                4 products detect ALL macro zoo viruses
                  and are rated "perfect":    AVK,FSE,PAV,SCN
                8 products detect >99% of macro zoo viruses
                  and are rated "excellent":
                        AVP,AVX,CMD,DRW,FPR,FPW,INO,NAV
                2 products detect >95% of macro zoo viruses
                  and are rated "very good":  AVA,AVG
                -----------------------------------------------------------
                For W-NT, script virus detection needs further work,
                although the development is promising, as:
                4 products detect ALL script zoo viruses
                  and are rated "perfect":    AVK,AVP,FSE,PAV
                1 product detects >99% of script zoo viruses
                  and is rated "excellent":   SCN
                2 products detect >95% of script zoo viruses
                  and are rated "very good":  DRW,INO
**************************************************************************
Overall: 3 WNT products now detect ALL zoo macro and script viruses
         "perfectly" (100% detection rate):          AVK,FSE,PAV
         1 WNT product detects >99% of zoo macro and script viruses
         "excellently":                              SCN
**************************************************************************

Eval WNT.02: In-The-Wild (Macro,Script) Detection under W-NT
============================================================

Concerning "In-The-Wild" viruses, the following grid is applied:
 - detection rate is 100% : scanner is "perfect"
 - detection rate is >99% : scanner is "excellent"
 - detection rate is >95% : scanner is "very good"
 - detection rate is >90% : scanner is "good"
 - detection rate is <90% : scanner is "risky"

100% detection of In-The-Wild viruses, especially including the
detection of ALL instantiations (files) of those viruses, is now an
ABSOLUTE REQUIREMENT for macro and script viruses (though it must be
observed that detection and identification are not completely
reliable). A minimal sketch of this criterion follows.
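The following sketch (Python; illustrative only, with hypothetical
sample counts) shows how the ITW grid is applied: the grade follows the
virus detection rate, and "perfect" additionally requires that more
than 99.9% of the infected files are detected:

   -------------------------------------------------------------------
   # Grade ITW detection according to the grid above.
   def itw_grade(viruses_found, viruses_total, files_found, files_total):
       virus_rate = 100.0 * viruses_found / viruses_total
       file_rate  = 100.0 * files_found  / files_total
       if virus_rate == 100.0 and file_rate > 99.9:
           return "perfect"
       if virus_rate > 99.0:
           return "excellent"
       if virus_rate > 95.0:
           return "very good"
       if virus_rate > 90.0:
           return "good"
       return "risky"

   # Hypothetical example: all 200 ITW viruses found, but only 845 of
   # 850 infected files -> not "perfect", graded "excellent":
   print(itw_grade(200, 200, 845, 850))
   -------------------------------------------------------------------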
The following 9 WNT products (of 22) reach 100% ITW macro and script
virus detection with >99.9% of files detected and are rated "perfect"
in this category:

                             ITW Viruses&Files
                             ( Macro   Script)
     ------------------------
     "Perfect" WNT ITW scanners:
                         AVK ( 100.0%  100.0%)
                         AVP ( 100.0%  100.0%)
                         AVX ( 100.0%  100.0%)
                         DRW ( 100.0%  100.0%)
                         FSE ( 100.0%  100.0%)
                         INO ( 100.0%  100.0%)
                         NAV ( 100.0%  100.0%)
                         PAV ( 100.0%  100.0%)
                         SCN ( 100.0%  100.0%)
     ------------------------

*********************************************************************
Findings WNT.2: 9 AV products (out of 22) detect ALL In-The-Wild
                macro and script viruses in >99.9% of files and
                are rated "perfect":
                   AVK,AVP,AVX,DRW,FSE,INO,NAV,PAV,SCN
                ----------------------------------------------------
                13 products can be rated "perfect" concerning the
                detection of ITW macro viruses:
                   AVG,AVK,AVP,AVX,CMD,DRW,FPR,FPW,FSE,INO,NAV,PAV,SCN
                ----------------------------------------------------
                10 products can be rated "perfect" concerning the
                detection of ITW script viruses:
                   AVK,AVP,AVX,DRW,FSE,INO,NAV,NVC,PAV,SCN
*********************************************************************

Eval WNT.03: Evaluation of overall W-NT AV detection rates (zoo,ITW)
====================================================================

The following grid is applied to classify scanners:
 - detection rate =100%     : scanner is graded "perfect"
 - detection rate above 99% : scanner is graded "excellent"
 - detection rate above 95% : scanner is graded "very good"
 - detection rate above 90% : scanner is graded "good"
 - detection rate of 80-90% : scanner is graded "good enough"
 - detection rate of 70-80% : scanner is graded "not good enough"
 - detection rate of 60-70% : scanner is graded "rather bad"
 - detection rate of 50-60% : scanner is graded "very bad"
 - detection rate below 50% : scanner is graded "useless"

To assess an "overall AV grade" (including macro and script virus
detection, for unpacked objects), the lowest of the related results is
used to classify each scanner. Only scanners for which all tests were
completed are considered. (For problems in test: see 8problms.txt.)
A minimal sketch of this "lowest result" rule follows.
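The following sketch (Python; illustrative only, not the VTC tooling)
applies the grid above to the lowest of the related zoo results:

   -------------------------------------------------------------------
   # Map a detection rate (percent) onto the grading grid above.
   def grade(rate):
       if rate >= 100.0: return "perfect"
       if rate >   99.0: return "excellent"
       if rate >   95.0: return "very good"
       if rate >   90.0: return "good"
       if rate >=  80.0: return "good enough"
       if rate >=  70.0: return "not good enough"
       if rate >=  60.0: return "rather bad"
       if rate >=  50.0: return "very bad"
       return "useless"

   def overall_grade(zoo_macro, zoo_script):
       # The weaker of the two categories determines the overall grade:
       return grade(min(zoo_macro, zoo_script))

   print(overall_grade(99.9, 95.1))   # INO's zoo rates -> "very good"
   -------------------------------------------------------------------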
Besides grading products in related categories according to their
performance, it is interesting to compare how products have developed.
In comparison with the previous results (VTC test "2001-04") and with
respect to macro and script viruses, it is noted whether a product
remained in the same category (=), improved into a higher category (+)
or lost a grade (-).

The following list indicates those scanners graded into one of the
upper three categories, with macro and script virus detection rates
for unpacked samples, and with perfect ITW virus detection (rate=100%
both for viruses and files). Under W-NT, 3 products now reached a 100%
detection rate for macro and script viruses, both zoo and In-The-Wild,
and could be rated "perfect" (last time, no product was rated
"perfect"). Another 2 scanners are graded "excellent" (>99%), and 2
more scanners are rated "very good" (>95%). In general, the low
detection rates for script viruses (esp. including detection in ALL
files) determine the overall grading:

           (zoo: macro/script ; ITW: macro/script)
     ----------------------------------------------
     "Perfect" W-NT scanners:
          AVK  ( 100%  100% ; 100%  100% )  (+)
          PAV  ( 100%  100% ; 100%  100% )  (+)
          FSE  ( 100%  100% ; 100%  100% )  (+)
     ----------------------------------------------
     "Excellent" W-NT scanners:
          AVP  ( 100~  100% ; 100%  100% )  (=)
          SCN  ( 100%  99.8 ; 100%  100% )  (=)
     ----------------------------------------------
     "Very Good" W-NT scanners:
          DRW  ( 99.5  95.4 ; 100%  100% )  (+)
          INO  ( 99.9  95.1 ; 100%  100% )  (+)
     ----------------------------------------------
     Remark: FSE detects all script viruses, but not in all files.

**************************************************************
Findings WNT.3: 3 WNT products are overall rated "perfect":
                     AVK,FSE,PAV
                2 "excellent" overall scanners:  AVP,SCN
                2 "very good" overall scanners:  DRW,INO
**************************************************************

Eval WNT.04: Evaluation of detection by virus classes under W-NT:
==================================================================

Some scanners are specialised in detecting one class of viruses
(either by deliberately limiting themselves to one class, esp. macro
viruses, or by detecting one class significantly better than the
others). It is therefore worth noting which scanners perform best in
detecting macro and script viruses; in all cases, 100% detection of
viruses in >99.9% of files is required for a product to be graded.
Products rated "perfect" (=100%), "excellent" (>99%) and "very good"
(>95%) are listed.

WNT.04.2 Grading the Detection of macro viruses under WNT
----------------------------------------------------------
"Perfect" WNT macro scanners:      AVK (100.0%)
                                   FSE (100.0%)
                                   PAV (100.0%)
                                   SCN (100.0%)
"Excellent" WNT macro scanners:    AVP ( 100~ )
                                   CMD ( 100~ )
                                   FPR ( 100~ )
                                   FPW ( 100~ )
                                   INO ( 99.9%)
                                   NAV ( 99.5%)
                                   DRW ( 99.5%)
                                   AVX ( 99.1%)
"Very Good" WNT macro scanners:    AVG ( 98.4%)

WNT.04.3 Grading the Detection of script viruses under WNT:
------------------------------------------------------------
"Perfect" WNT script scanners:     AVK (100.0%)
                                   AVP (100.0%)
                                   PAV (100.0%)
"Excellent" WNT script scanners:   FSE (100.0%) (*)
                                   SCN ( 99.8%)
"Very Good" WNT script scanners:   INO ( 95.1%)

(*) all viruses detected, but in <99.9% of files

*****************************************************************
Finding WNT.4:  Performance of WNT scanners by virus classes:
                ---------------------------------------------
                4 "Perfect" scanners for macro zoo:
                     AVK,FSE,PAV,SCN
                8 "Excellent" scanners for macro zoo:
                     AVP,CMD,DRW,FPR,FPW,INO,NAV,AVX
                --------------------------------------------------
                3 "Perfect" scanners for script zoo:
                     AVK,AVP,PAV
                2 "Excellent" scanners for script zoo:
                     FSE,SCN
*****************************************************************

Eval WNT.05: Detection of Packed Macro Viruses under W-NT
=========================================================

The detection of macro viruses within packed objects is becoming
essential for on-access scanning, esp. for incoming email possibly
loaded with malicious objects. It therefore seems reasonable to test
whether at least ITW viral objects compressed with 6 popular methods
(PKZIP, ARJ, LHA, RAR, WinRAR and CAB) are also detected. Tests are
performed only on In-The-Wild viruses packed once (no recursive
packing). As the last test showed that AV products are rather far from
perfect detection of packed viruses, the testbed has been left
essentially unchanged to ease comparison and improvement. A minimal
sketch of the per-packer grading applied in this section follows.
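The following sketch (Python; illustrative only, with a hypothetical
result layout) summarises the per-packer grading used below:
"perfect" = 100% detection for all 6 packers, "excellent" = 100% for
at least 5 packers, "very good" = 100% for at least 4 packers:

   -------------------------------------------------------------------
   PACKERS = ("PKZIP", "ARJ", "LHA", "RAR", "WinRAR", "CAB")

   def packed_grade(rates):
       """rates: mapping packer name -> detection rate in percent."""
       full = sum(1 for p in PACKERS if rates.get(p, 0.0) >= 100.0)
       if full == 6:
           return "perfect"
       if full >= 5:
           return "excellent"
       if full >= 4:
           return "very good"
       return "not graded"

   # Hypothetical example: full detection except within CAB archives:
   print(packed_grade({"PKZIP": 100, "ARJ": 100, "LHA": 100,
                       "RAR": 100, "WinRAR": 100, "CAB": 97.3}))
   # -> "excellent"
   -------------------------------------------------------------------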
We can report a very good result: while last time only 2 products
detected all viruses packed with ALL 6 packers, now 8 (of 22) products
detect ALL viruses packed with ALL 6 packers. And 3 more (last time: 0)
detect all viruses packed with at least 4 packers (in both classes
together: 11 products, after 2 in the last test).

A "perfect" product would detect ALL packed viral samples (100%) for
all (6) packers:
     ------------------------------------------------------
     "Perfect" packed virus detectors:
          AVK, AVP, AVX, CMD, FPR, FPW, PAV, SCN
     ------------------------------------------------------

An "excellent" product would reach 100% detection of packed viral
samples for at least 5 packers:
     --------------------------------------------------------
     "Excellent" packed macro virus detectors:   ---
     --------------------------------------------------------

A "very good" product would detect all viral samples (ITW macro) for
at least 4 packers:
     ------------------------------------------------------
     "Very Good" packed macro virus detectors:   AVG, DRW, INO
     ------------------------------------------------------

Remark: Much more data were collected on the precision and reliability
of virus detection in packed objects. But in the present state, it
seems NOT justified to add such differentiation to the results
discussed here.

*************************************************************************
Findings WNT.5: Detection of packed viral objects is improving:
                8 "Perfect" packed macro virus WNT detectors:
                     AVK, AVP, AVX, CMD, FPR, FPW, PAV, SCN
                0 "Excellent" packed macro virus detectors:  ---
                3 "Very Good" packed macro virus detectors:
                     AVG, DRW, INO
*************************************************************************

Eval WNT.06: Avoidance of False Alarms (Macro) under W-NT:
==========================================================

First introduced in VTC test "1998-10", a set of clean (and
non-malicious) objects has been added to the macro virus testbeds to
determine the ability of scanners to avoid False-Positive (FP) alarms.
This ability is essential for "excellent" and "very good" scanners, as
there is no automatic aid for customers to handle such cases (besides
the psychological impact of false alarms on the customer's work).
Therefore, the grid used for grading AV products must be significantly
more rigid than the one used for detection. The following grid is
applied to classify scanners:

 - False Positive rate =  0.0% : scanner is graded "perfect"
 - False Positive rate <  0.5% : scanner is graded "excellent"
 - False Positive rate <  2.5% : scanner is graded "very good"
 - False Positive rate <  5.0% : scanner is graded "good enough"
 - False Positive rate < 10.0% : scanner is graded "rather bad"
 - False Positive rate < 20.0% : scanner is graded "very bad"
 - False Positive rate > 20.0% : scanner is graded "useless"

Regarding the ability of scanners to avoid FP alarms, 7 (out of 22)
products in test reported NO SINGLE False-Positive alarm in the macro
zoo testbeds and are therefore rated "perfect":
     ----------------------------------------------------------------
     7 "Perfect" FP-avoiding WNT scanners:
          ANT, AVA, AVG, AVK, INO, RAD, SCN
     ----------------------------------------------------------------
     (MR2, QHL and VSP also avoid FPs, but at a low level of virus
     detection.)

A minimal sketch of this grading follows.
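The following sketch (Python; illustrative only, the size of the
clean-object testbed in the example is hypothetical) computes the FP
rate and applies the deliberately stricter grid above:

   -------------------------------------------------------------------
   def fp_grade(false_alarms, clean_objects):
       """Grade False-Positive avoidance over the clean testbed."""
       rate = 100.0 * false_alarms / clean_objects
       if rate == 0.0:  return "perfect"
       if rate <  0.5:  return "excellent"
       if rate <  2.5:  return "very good"
       if rate <  5.0:  return "good enough"
       if rate < 10.0:  return "rather bad"
       if rate < 20.0:  return "very bad"
       return "useless"

   # Hypothetical example: no alarm on 329 clean objects -> "perfect":
   print(fp_grade(0, 329))
   -------------------------------------------------------------------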
****************************************************************
Findings WNT.6: Avoidance of False-Positive alarms is improving,
                though it is still regarded as insufficient.
                7 FP-avoiding "perfect" W-NT scanners:
                     ANT, AVA, AVG, AVK, INO, RAD, SCN
****************************************************************

Eval WNT.07: Detection of Macro and Script Malware under W-NT
=============================================================

Since test "1997-07", VTC also tests the ability of AV products to
detect non-viral malware. An essential argument for this category is
that customers are interested in also being warned about, and
protected from, non-viral and non-wormy malicious objects such as
trojans etc., whose payload may be disastrous to their work (e.g.
stealing passwords). Since VTC test "1999-03", malware detection has
been a mandatory part of VTC tests, both for submitted products and
for those downloaded as free evaluation copies. A growing number of
scanners is indeed able to detect non-viral malware.

The following grid (admittedly with reduced granularity) is applied to
classify the detection of macro and script malware:
 - detection rate  =100%    : scanner is "perfect"
 - detection rate  > 90%    : scanner is "excellent"
 - detection rate of 80-90% : scanner is "very good"
 - detection rate of 60-80% : scanner is "good enough"
 - detection rate of < 60%  : scanner is "not good enough"

We can report that macro malware detection has now reached a rather
good level, as 19 (out of 22) products can be rated "perfect" (2),
"excellent" (12) or "very good" (5). But for script malware, detection
quality must be improved.

Concerning Macro AND Script malware detection:
     -----------------------------------------------------
     2 "Perfect" macro/script malware detectors under WNT:
                             Macro   Script
                        PAV  ( 100%   100% )
                        SCN  ( 100%   100% )
     -----------------------------------------------------
     "Excellent" macro/script malware detectors under WNT:
                             Macro   Script
                        AVK  (99.8%  100.0%)
                        AVP  (99.8%  100.0%)
                        FSE  (99.8%  100.0%)
     -----------------------------------------------------
     "Very Good" macro/script malware detectors under WNT:
                        RAD  (97.7%   81.1%)
     -----------------------------------------------------

Concerning macro malware detection only, several more products (with
problems concerning script malware detection) can be reported as
"excellent" or "very good":
     ---------------------------------------------------
     "Perfect" macro malware detectors under WNT:
                        PAV  (100.0%)
                        SCN  (100.0%)
     ---------------------------------------------------
     "Excellent" macro malware detectors under WNT:
                        AVK  ( 99.8%)
                        AVP  ( 99.8%)
                        FSE  ( 99.8%)
                        CMD  ( 99.5%)
                        FPR  ( 99.5%)
                        FPW  ( 99.5%)
                        NVC  ( 98.8%)
                        RAD  ( 97.7%)
                        RAV  ( 97.4%)
                        INO  ( 93.4%)
                        AVX  ( 92.0%)
                        DRW  ( 90.8%)
     ---------------------------------------------------
     "Very Good" macro malware detectors under WNT:
                        IKA  ( 89.9%)
                        ANT  ( 88.7%)
                        AVA  ( 88.5%)
                        NAV  ( 86.4%)
                        AVG  ( 82.6%)
     ---------------------------------------------------

And concerning script malware detection only, detection rates are less
acceptable and need improvement:
     ---------------------------------------------------
     5 "Perfect" script malware detectors under WNT:
          AVK, AVP, FSE, PAV, SCN
     ---------------------------------------------------
     "Excellent" script malware detectors under WNT:  ---
     ---------------------------------------------------
     "Very Good" script malware detectors under WNT:  RAD
     ---------------------------------------------------

*******************************************************************
Findings WNT.7: Macro&Script malware detection under WNT is slowly
                improving. Macro malware detection is significantly
                better developed than script malware detection.
                ---------------------------------------------------
                2 macro&script malware detectors are now "perfect":
                     PAV, SCN
                3 products are rated "excellent":
                     AVK, AVP, FSE
                ***************************************************
                Concerning macro malware detection only, 2 products
                are rated "perfect":     PAV, SCN
                and 10 more products are rated "excellent":
                     AVK, AVP, CMD, FSE, FPR, FPW, NVC, RAD,
                     INO, AVX
                ***************************************************
                Concerning script malware detection only, 5 products
                are rated "perfect":     AVK, AVP, FSE, PAV, SCN
                and NO product is rated "excellent" (>90% detection).
******************************************************************

Eval WNT.SUM: Grading of WNT products:
======================================

Under the scope of VTC's grading system, a "perfect WNT AV/AM product"
would have the following characteristics:

Definition (1): A "Perfect AntiVirus (AV) product"
--------------------------------------------------
 1) will detect ALL viral samples "In-The-Wild" AND at least 99.9% of
    zoo samples, in ALL categories (file, boot, macro and script-based
    viruses), always with the same high precision of identification,
    and in every infected sample,
 2) will detect ALL ITW viral samples in compressed objects for all
    (6) popular packers, and
 3) will NEVER issue a False-Positive alarm on any sample which is
    not viral.

Definition (2): A "Perfect AntiMalware (AM) product"
----------------------------------------------------
 1) will be a "Perfect AntiVirus product", that is:
       100% ITW detection AND >99% zoo detection
       AND high precision of identification
       AND high precision of detection
       AND 100% detection of ITW viruses in compressed objects
       AND 0% False-Positive rate,
 2) AND it will also detect essential forms of malicious software, at
    least in unpacked forms, reliably at high rates (>90%).

A minimal sketch expressing both definitions as predicates follows.
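The following sketch (Python; illustrative only, field names are
assumptions) restates Definitions (1) and (2) as predicates over a
product's test results:

   -------------------------------------------------------------------
   from dataclasses import dataclass

   @dataclass
   class Results:
       itw_rate: float        # ITW detection, % of viruses
       itw_file_rate: float   # ITW detection, % of infected files
       zoo_rate: float        # zoo detection in %
       packers_at_100: int    # packers (of 6) with 100% packed detection
       fp_rate: float         # False-Positive rate in %
       malware_rate: float    # non-viral malware detection in %

   def perfect_av(r: Results) -> bool:
       # Definition (1): full ITW detection in every infected sample,
       # at least 99.9% zoo detection, all 6 packers, no FP alarms.
       return (r.itw_rate == 100.0 and r.itw_file_rate == 100.0
               and r.zoo_rate >= 99.9
               and r.packers_at_100 == 6
               and r.fp_rate == 0.0)

   def perfect_am(r: Results) -> bool:
       # Definition (2): a perfect AV product that also detects
       # malware reliably at high rates (>90%).
       return perfect_av(r) and r.malware_rate > 90.0
   -------------------------------------------------------------------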
*********************************************************************
In VTC test "2001-10", we found   **  1 perfect WNT AV product:  **
                                  **            AVK              **
but we found                      ** No perfect WNT AM product   **
*********************************************************************

But several products seem to approach our definitions on a rather
high level (taking "perfect" as defined at the 100% level, and
"excellent" as defined by >99% for virus detection and >90% for
malware detection):

Test category:            "Perfect"              "Excellent"
-----------------------------------------------------------------
WNT zoo macro test:       AVK,FSE,PAV,SCN        AVP,AVX,CMD,DRW,
                                                 FPR,FPW,INO,NAV
WNT zoo script test:      AVK,AVP,PAV            FSE,SCN
WNT ITW tests:            AVK,AVP,AVX,DRW,       ---
                          FSE,INO,NAV,PAV,SCN
WNT pack-tests:           AVK,AVP,AVX,CMD,FPR,   ---
                          FPW,PAV,SCN
WNT FP avoidance:         ANT,AVA,AVG,AVK,INO,   ---
                          RAD,SCN
-----------------------------------------------------------------
WNT Macro Malware Test:   PAV,SCN                AVK,AVP,CMD,FSE,FPR,
                                                 FPW,NVC,RAD,INO,AVX
WNT Script Malware Test:  AVK,AVP,FSE,PAV,SCN    ---
-----------------------------------------------------------------

In order to support the race for more customer protection, we evaluate
the order of performance in this WNT test with a simple algorithm: we
count placements per test category, weighting "perfect" with 2 points
and "excellent" with 1 point, for the first places (a minimal sketch
of this point count is given after the lists below):

************************************************************
"Perfect" WNT AV product:     AVK              (10 points)
************************************************************
 2nd place:  SCN                               ( 9 points)
 3rd place:  PAV                               ( 8 points)
 4th place:  AVP                               ( 7 points)
 5th place:  AVX,FSE,INO                       ( 5 points)
 8th place:  CMD,FPR,FPW,DRW,NAV               ( 3 points)
13th place:  ANT,AVA,AVG                       ( 2 points)
16th place:  RAD/RAV                           ( 1 point )
************************************************************
"Perfect" WNT AntiMalware product:   =NONE=
   (a "perfect" AM product would reach the maximum of 14 points)
"Excellent" WNT AntiMalware products:
 1st place:  AVK,SCN                           (13 points)
 3rd place:  PAV                               (12 points)
 4th place:  AVP                               (11 points)
 5th place:  FSE                               ( 8 points)
 6th place:  AVX,INO                           ( 6 points)
 8th place:  CMD,FPR,FPW                       ( 4 points)
11th place:  RAD/RAV                           ( 2 points)
************************************************************
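The point count above can be written down in a few lines. The
following sketch (Python; illustrative only, not the original VTC
scoring script) counts 2 points per "perfect" and 1 point per
"excellent" grade over the AV test categories of the summary table;
note that combined entries such as RAD/RAV are placed slightly
differently in the report itself:

   -------------------------------------------------------------------
   AV_CATEGORIES = {
       "zoo macro":  {"perfect": {"AVK", "FSE", "PAV", "SCN"},
                      "excellent": {"AVP", "AVX", "CMD", "DRW",
                                    "FPR", "FPW", "INO", "NAV"}},
       "zoo script": {"perfect": {"AVK", "AVP", "PAV"},
                      "excellent": {"FSE", "SCN"}},
       "ITW":        {"perfect": {"AVK", "AVP", "AVX", "DRW", "FSE",
                                  "INO", "NAV", "PAV", "SCN"},
                      "excellent": set()},
       "packed":     {"perfect": {"AVK", "AVP", "AVX", "CMD", "FPR",
                                  "FPW", "PAV", "SCN"},
                      "excellent": set()},
       "FP":         {"perfect": {"ANT", "AVA", "AVG", "AVK", "INO",
                                  "RAD", "SCN"},
                      "excellent": set()},
   }

   def points(categories):
       score = {}
       for grades in categories.values():
           for product in grades["perfect"]:
               score[product] = score.get(product, 0) + 2
           for product in grades["excellent"]:
               score[product] = score.get(product, 0) + 1
       return sorted(score.items(), key=lambda kv: -kv[1])

   for product, pts in points(AV_CATEGORIES):
       print(f"{product}: {pts} points")   # AVK: 10, SCN: 9, PAV: 8, ...
   -------------------------------------------------------------------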