=========================================
File 7EVALW98.TXT
-----------------------------------------
Evaluation of results for Macro and Script Virus/Malware
detection under Windows-98 in VTC Test "2001-10":
=========================================

Formatted with non-proportional font (Courier)

Content of this file:
************************************************************************
Eval W98:    Development of detection rates under Windows-98:
************************************************************************
Eval W98.01: Development of W-98 Scanner Detection Rates
             Table W98-A2: Macro/Script Virus Detection Rate
                           in last 7 VTC tests
Eval W98.02: In-The-Wild Detection under W-98
Eval W98.03: Evaluation of overall W-98 AV detection rates (zoo,ITW)
Eval W98.04: Evaluation of detection by virus classes under W-98
    W98.04.2 Grading the Detection of macro viruses under W-98
    W98.04.3 Grading the Detection of script viruses under W-98
Eval W98.05: Detection of Packed Macro Viruses under W-98
Eval W98.06: Evaluation of False Alarms (Macro) under W-98
Eval W98.07: Detection of Macro and Script Malware under W-98
Eval W98.SUM: Grading of W-98 products
************************************************************************

This part of the VTC "2001-10" test report evaluates the detailed results
as given in the following section (file):
   6FW98.TXT   File/Macro Viruses/Malware results W-98

The following *21* products participated in this scanner test
for W-98 products:

--------------------------------------------------------
Products submitted for aVTC test under Windows-98:
--------------------------------------------------------
ANT  v(def): 6.8.0.56        sig: June 22,2001
AVA  v(def): 3.0.354.0       sig: June 25,2001
AVG  v(def): 6.0.263         sig: June 22,2001
AVK  v(def): 10.0.167        sig: June 21,2001
AVP  v(def): 3.5.133.0       sig: June 01,2001
AVX  v(def): 6.1             sig: June 18,2001
CMD  v(def): 4.61.5          sig: June 25,2001
DRW  v(def): 4.25            sig: June 20,2001
DSE  v(def): 4.0.3           sig: June 20,2001
FPR  v(def): 3.09d           sig: June 25,2001
FPW  v(def): 3.09d           sig: June 25,2001
FSE  v(def): 1.00.1251       sig: June 20,2001
        scan eng fprot: 3.09.507
        scan eng avp:   3.55.3210
INO  v(def): 6.0.85          sig: June 14,2001
MR2  v(def): 1.17            sig: June 2001
NAV  v(def): 4.1.0.6         sig: June 22,2001
NVC  v(def): 5.00.25         sig: June 19,2001
PAV  v(def): 3.5.133.0       sig: June 23,2001
QHL  v(def): 6.02            sig: June 28,2001
RAV  v(def): 8.2.001, scan eng: 8.3     sig: June 25,2001
SCN  v(def): 4144, scan eng: 4.1.40     sig: June 20,2001
VSP  v(def): 12.22.1         sig: June 25,2001
--------------------------------------------------------

Eval W98.01: Development of Scanner Detection Rates under Windows-98:
=====================================================================

The following table summarizes the results of macro and script virus
detection under Windows-98 in the last 7 VTC tests:

Table W98-A2: Comparison: Macro/Script Virus Detection Rate in last 7 VTC tests under W-98:
===========================================================================================
Scan  -------------- Macro Virus Detection --------------+- Script Virus Detection -
ner   98/10 99/03 99/09 00/04 00/08 01/04 01/10 DELTA  I 00/08 01/04 01/10 DELTA
-------------------------------------------------------+-------------------------
ACU       -  97.6     -     -     -     -     -     -  I     -     -     -     -
ADO       -     -     -     -     -  99.9     -     -  I     -  99.8     -     -
AN5       -     -  89.3     -     -     -     -     -  I     -     -     -     -
ANT    84.3     -  89.5  90.2  96.4     -  97.4     -  I  55.2     -  99.1     -
ANY    70.7     -     -     -     -     -     -     -  I     -     -     -     -
ATR       -     -     -     -     -     -     -     -  I     -   2.7     -     -
AVA/3  96.7  95.9  93.9  94.3  94.1  95.7  97.7  +2.0  I  15.0  30.0  33.7  +3.7
AVG       -  82.5  96.6  97.5  97.9  98.3  98.4  +0.1  I     -  57.9  62.9  +5.0
AVK    99.6  99.6 100.0  99.9  100~  100~  100%   0.0  I  91.2  99.8  100%  +0.2
AVP   100.0  99.2 100.0  99.9  100~  100~  100~   0.0  I  88.2  99.8  100%  +0.2
AVX       -     -  98.7  94.5  99.0     -  99.1     -  I  61.4     -  70.1     -
CLE       -     -     -     -     -   0.0     -     -  I   4.2   6.3     -     -
CMD       -     -  99.6 100.0  100%  100%  100~   0.0  I  93.5  96.9  93.9  -3.0
DRW       -  98.3  98.8  98.4     -  98.0  99.5  +1.5  I     -  95.6  95.4     -
DSE   100.0 100.0     * 100.0  100%  99.9  97.8  -2.1  I  95.8  100%  73.0 -27.0
ESA       -     -     -  88.9     -     -     -     -  I     -     -     -     -
FPR    92.4  99.8  99.7 100.0     -  100%  100~   0.0  I     -  96.9  94.6  -2.3
FPW       -     -  99.9 100.0  100%  100%  100~   0.0  I  90.8  96.9  94.6  -2.3
FSE   100.0 100.0 100.0 100.0  100%  100%  100%   0.0  I  96.7  100%  100%   0.0
FWN    99.6  99.7  99.9  99.8     -     -     -     -  I     -     -     -     -
HMV       -  99.5     -     -     -     -     -     -  I     -     -     -     -
IBM    94.5     *     *     *     *     *     *     *  I     *     *     *     *
INO    88.1  99.8  98.1  99.7  99.8  99.7  99.9  +0.2  I  78.1  92.7  95.1  +2.4
IRS    99.0  99.5     -     -     -     -     -     -  I     -     -     -     -
ITM       -     -     -     -     -     -     -     -  I     -     -     -     -
IVB    92.8  95.0     -     -     -     -     -     -  I     -     -     -     -
MKS       -     -     -  97.1     -  44.2     -     -  I     -     -     -     -
MR2       -     -  64.9     -     -     -  40.8     -  I     -  85.1  83.3  -1.8
NAV    95.3  99.7  98.7  98.0  97.7  97.0  99.5  +2.5  I  36.6  65.5  94.2 +29.7
NOD       -  99.8 100.0  99.4     -     -     -     -  I     -     -     -     -
NV5       -     -  99.6     -     -     -     -     -  I     -     -     -     -
NVC       -  99.1  99.6  99.9  99.9  99.8  99.8   0.0  I  83.7  88.5  91.3  +2.8
PAV    99.5  99.5  86.7  99.9  100~  99.5  100%  +0.5  I  90.2  99.8  100%  +0.2
PCC       -  98.0     -     -     -     -     -     -  I     -     -     -     -
PER       -     -     -  53.7  67.2  68.5     -     -  I  18.0  22.0     -     -
PRO       -  58.0  61.9  67.4  69.1  67.1     -     -  I  12.1  40.7     -     -
QHL       -     -     -   0.0     -   0.0   0.0   0.0  I   6.9     -     -     -
RAV    92.2     -  98.1  97.9  96.9  99.6  99.5  -0.1  I  47.1  84.9  82.5  -2.4
SCN    97.7 100.0  99.8 100.0  100%  100%  100%   0.0  I  95.8  100%  99.8  -0.2
SWP    98.6     -  98.5  98.6     -     -     -     -  I     -     -     -     -
TBA    98.7     *     *     *     *     *     *     *  I     -     -     -     -
TSC       -  76.5  64.9     -     -     -     -     -  I     -     -     -     -
VBS    41.5     -     -     -     -     -     -     -  I     -     -     -     -
VBW    93.4     -     -     -     -     -     -     -  I     -     -     -     -
VET       -  97.6     *     *     *     *     -     -  I     -     -     -     -
VSP       -   0.4   0.3     -     -   0.0   0.0   0.0  I     -  85.3  84.0  -1.3
-------------------------------------------------------+-------------------------
Mean   92.1  90.3  93.5  95.0  95.6  84.7 87.1% +0.3%  I  61.0  76.0 86.5% +0.2%
Without extremely low detectors:          (96.3%)      I
-------------------------------------------------------+-------------------------

Remark: For abbreviations of products (code names), see appendix A5CodNam.txt.

The number of scanners in this test has grown to 21. But as some new
products reached very low detection rates, the mean detection rates for
macro and script viruses are strongly depressed by these outliers.

Concerning macro viruses, the "mean" detection rate is significantly
reduced (esp. due to several products with very insufficient detection
rates), but those products which participated in the last test at a
still acceptable level (96.3%) further improved their detection rates
slightly (+0.3%).

Concerning script viruses, presently the fastest growing sector, the
mean detection rate is significantly improved though still low (86.5%).
Those scanners which participated in the last test improved their
detection rate further by 0.2% "in the mean".

***************************************************************************
Findings W98.1: For W-98, macro and script zoo virus detection rates are,
                in the mean, increasing but still insufficient; to be
                rated, all products MUST detect ITW viruses "perfectly"
                (100%).
                -----------------------------------------------------------
                Concerning macro virus detection only:
                  4 products detect ALL macro zoo viruses
                    and are rated "perfect":    AVK,FSE,PAV,SCN
                  9 products detect >99% of macro zoo viruses
                    and are rated "excellent":  AVP,AVX,CMD,DRW,FPR,
                                                FPW,INO,NAV,NVC
                  3 products detect >95% of macro zoo viruses
                    and are rated "very good":  ANT,AVA,AVG
                -----------------------------------------------------------
                Concerning script virus detection only, further work is
                required though the development is promising:
                  4 products detect ALL script zoo viruses
                    and are rated "perfect":    AVK,AVP,FSE,PAV
                  1 product detects >99% of script zoo viruses
                    and is rated "excellent":   SCN
                  2 products detect >95% of script zoo viruses
                    and are rated "very good":  DRW,INO
**************************************************************************
Overall:  3 W-98 products now detect ALL zoo macro and script viruses
            "perfectly" (100% detection rate):        AVK,FSE,PAV
          2 products detect >99% of macro & script zoo viruses
            and are rated "excellent":                AVP,SCN
          2 products detect >95% of macro & script zoo viruses
            and are rated "very good":                DRW,INO
**************************************************************************

Eval W98.02: In-The-Wild (Macro,Script) Detection under W-98
============================================================

Concerning "In-The-Wild" viruses, the following grid is applied:
   - detection rate is 100% : scanner is "perfect"
   - detection rate is >99% : scanner is "excellent"
   - detection rate is >95% : scanner is "very good"
   - detection rate is >90% : scanner is "good"
   - detection rate is <90% : scanner is "risky"

100% detection of In-the-Wild viruses, including detection of ALL
instantiations (infected files) of those viruses, is now an ABSOLUTE
REQUIREMENT for macro and script viruses (it must be observed that
detection and identification are not always completely reliable).
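For illustration, such a grid can be read as a simple threshold table.
The following minimal Python sketch (the function name itw_grade is
illustrative only and not part of the VTC test tooling) maps an ITW
detection rate to the grade names above:

   # Illustrative sketch only: map an ITW detection rate (in percent)
   # to the VTC grade names from the grid above.
   def itw_grade(rate_percent):
       if rate_percent == 100.0:
           return "perfect"
       if rate_percent > 99.0:
           return "excellent"
       if rate_percent > 95.0:
           return "very good"
       if rate_percent > 90.0:
           return "good"
       return "risky"          # everything at or below 90% is "risky"

   # e.g. itw_grade(100.0) -> "perfect",  itw_grade(99.5) -> "excellent",
   #      itw_grade(92.0)  -> "good",     itw_grade(85.0) -> "risky"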
The following 9 W-98 products (of 21) reach 100% for ITW macro and script
virus detection for all objects and are rated "perfect" in this category
(alphabetically ordered):

                     ITW Viruses&Files
                     ( Macro   Script )
      ------------------------
      "Perfect" W-98 ITW scanners:
                 AVK ( 100.0%  100.0% )
                 AVP ( 100.0%  100.0% )
                 AVX ( 100.0%  100.0% )
                 DRW ( 100.0%  100.0% )
                 FSE ( 100.0%  100.0% )
                 INO ( 100.0%  100.0% )
                 NAV ( 100.0%  100.0% )
                 PAV ( 100.0%  100.0% )
                 SCN ( 100.0%  100.0% )
      ------------------------

**************************************************************
Findings W98.2: 9 AV products (out of 21) detect ALL In-The-Wild macro &
                script viruses in >99.9% of files and are rated
                "perfect ITW scanners":
                   AVK,AVP,AVX,DRW,FSE,INO,NAV,PAV,SCN
                ---------------------------------------------
                Concerning ITW macro virus detection, 13 products detect
                ALL viruses in >99.9% of files:
                   AVG,AVK,AVP,AVX,CMD,DRW,FPR,FPW,FSE,INO,NAV,PAV,SCN
                ---------------------------------------------
                Concerning ITW script virus detection, 10 products detect
                ALL viruses in >99.9% of files:
                   AVK,AVP,AVX,DRW,FSE,INO,NAV,NVC,PAV,SCN
**************************************************************

Eval W98.03: Evaluation of overall W-98 AV detection rates (zoo,ITW)
====================================================================

The following grid is applied to classify scanners:
   - detection rate  = 100%   : scanner is graded "perfect"
   - detection rate above 99% : scanner is graded "excellent"
   - detection rate above 95% : scanner is graded "very good"
   - detection rate above 90% : scanner is graded "good"
   - detection rate of 80-90% : scanner is graded "good enough"
   - detection rate of 70-80% : scanner is graded "not good enough"
   - detection rate of 60-70% : scanner is graded "rather bad"
   - detection rate of 50-60% : scanner is graded "very bad"
   - detection rate below 50% : scanner is graded "useless"

To assess an "overall AV grade" (including macro and script virus
detection, for unpacked objects), the lowest of the related results is
used to classify each scanner. Only scanners for which all tests were
completed are considered (for problems in test, see 8problms.txt).

Besides grading products in the related categories according to their
performance, it is interesting to compare how products developed. In
comparison with previous results (VTC test "2001-04") and with respect
to macro and script viruses, it is noted whether a product remained in
the same category (=), improved into a higher category (+) or lost some
grade (-).

The following list indicates those scanners graded into one of the upper
three categories, with macro and script virus detection rates in unpacked
samples, and with perfect ITW virus detection (rate=100%).

Under W-98, now 3 products (before: 0) reached 100% detection rates for
macro and script viruses, both zoo and In-The-Wild (and there in ALL
files), and are rated "perfect".
2 scanners are graded "Excellent" (>99%), and 2 more scanners are rated
"very good" (>95%):

                         (zoo: macro/script ; ITW: macro/script)
      --------------------------------------------------
      "Perfect" W-98 zoo scanners:
                 AVK ( 100%  100%  ;  100%  100% )   (+)
                 FSE ( 100%  100%  ;  100%  100% )   (+)
                 PAV ( 100%  100%  ;  100%  100% )   (+)
      --------------------------------------------------
      "Excellent" W-98 zoo scanners:
                 AVP ( 100~  100%  ;  100%  100% )   (=)
                 SCN ( 100%  99.8  ;  100%  100% )   (=)
      --------------------------------------------------
      "Very Good" W-98 zoo scanners:
                 DRW ( 99.5  95.4  ;  100%  100% )   (=)
                 INO ( 99.0  95.1  ;  100%  100% )   (+)
      --------------------------------------------------
      * Product detects all ITW script viruses but not in all files.

**********************************************************************
Findings W98.3: Now, 3 W98 products are overall "perfect":  AVK,FSE,PAV
                2 products are rated "excellent":           AVP,SCN
                2 products are rated "very good":           DRW,INO
**********************************************************************

Eval W98.04: Evaluation of detection by virus classes under W-98:
=================================================================

Some scanners are specialised in detecting one class of viruses (either
by deliberately limiting themselves to one class, esp. macro viruses, or
by detecting one class significantly better than others). It is therefore
worth noting which scanners perform best in detecting macro and script
viruses; in all cases, 100% detection of viruses in all files is required
for a product to be graded. Products rated "perfect" (=100%), "excellent"
(>99%) and "very good" (>95%) are listed.

W98.04.2 Grading the Detection of macro viruses under W-98:
------------------------------------------------------------

                      macro zoo / ITW
      ----------------
      "Perfect" macro zoo products:
                 AVK ( 100.0%  100.0% )
                 FSE ( 100.0%  100.0% )
                 PAV ( 100.0%  100.0% )
                 SCN ( 100.0%  100.0% )
      "Excellent" macro zoo products:
                 AVP ( 100~    100.0% )
                 CMD ( 100~    100.0% )
                 FPR ( 100~    100.0% )
                 FPW ( 100~    100.0% )
                 INO (  99.9%  100.0% )
                 DRW (  99.5%  100.0% )
                 NAV (  99.5%  100.0% )
                 AVX (  99.1%  100.0% )
      "Very Good" macro zoo product:
                 AVA (  97.7%  100.0% )
      ----------------

W98.04.3 Grading the Detection of Script viruses under W-98:
------------------------------------------------------------

                      script zoo / ITW
      ----------------
      "Perfect" script zoo products:
                 AVK ( 100.0%  100.0% )
                 AVP ( 100.0%  100.0% )
                 FSE ( 100.0%  100.0% )
                 PAV ( 100.0%  100.0% )
      "Excellent" script zoo products:
                 SCN (  99.8%  100.0% )
      "Very Good" script zoo products:
                 DRW (  95.4%  100.0% )
                 INO (  95.1%  100.0% )
      ----------------

*******************************************************************
Finding W98.4: Performance of W-98 scanners by virus classes:
               ---------------------------------------------
               Concerning overall macro virus detection:
                 4 products are rated "perfect":   AVK,FSE,PAV,SCN
                 8 products are rated "excellent": AVP,CMD,FPR,FPW,
                                                   INO,DRW,NAV,AVX
                 1 product is rated "very good":   AVA
               ----------------------------------------------------
               Concerning overall script virus detection:
                 2 products are rated "perfect":   AVK,AVP
                 3 products are rated "excellent": FSE,PAV,SCN
                 2 products are rated "very good": INO,DRW
******************************************************************

Eval W98.05: Detection of Packed Macro Viruses under W-98
=========================================================

Detection of macro viruses within packed objects becomes essential for
on-access scanning, esp. for incoming email possibly loaded with
malicious objects.
It seems therefore reasonable to test whether at least ITW viral objects
compressed with 6 popular methods (PKZIP, ARJ, LHA, RAR, WinRAR and CAB)
are also detected. Tests are performed only on In-The-Wild viruses packed
once (no recursive packing). As the last test showed that AV products are
rather far from perfect detection of packed viruses, the testbed has
remained essentially unchanged to ease comparison and to track improvement.

A "perfect" product would detect ALL packed macro samples (100%) in all
files for all (6) packers:
      ---------------------------------------------------------------------
      "Perfect" packed macro virus detectors:   AVK,AVP,CMD,FPR,FPW,PAV,SCN
      ---------------------------------------------------------------------

An "excellent" product would reach 100% detection of packed macro viruses
for at least 5 packers:
      -----------------------------------------------
      "Excellent" packed macro virus detectors: ---
      -----------------------------------------------

A "very good" product would detect viral samples (ITW macro) for at least
4 packers:
      ----------------------------------------------------
      "Very Good" packed macro virus detectors: AVG,DRW,INO
      ----------------------------------------------------

Remark: Much more data were collected on the precision and reliability of
virus detection in packed objects. But in the present state, it seems NOT
justified to add this differentiation to the results discussed here.

*************************************************************************
Findings W98.5: Progress in detection of packed viral objects.
                7 "Perfect" packed macro virus W98 detectors:
                      AVK,AVP,CMD,FPR,FPW,PAV,SCN
                0 "Excellent" packed macro virus W98 detectors:  ---
                3 "Very Good" packed macro virus W98 detectors:  AVG,DRW,INO
*************************************************************************

Eval W98.06: Avoidance of False Alarms (Macro) under W-98:
==========================================================

First introduced in VTC test "1998-10", a set of clean (and non-malicious)
objects has been added to the macro virus testbeds to determine the
ability of scanners to avoid False-Positive (FP) alarms. This ability is
essential for "excellent" and "very good" scanners, as there is no
automatic aid for customers to handle such cases (besides the
psychological impact on the customer's work). Therefore, the grid used
for grading AV products must be significantly more rigid than the one
used for detection.

The following grid is applied to classify scanners:
   - False Positive rate  = 0.0% : scanner is graded "perfect"
   - False Positive rate  < 0.5% : scanner is graded "excellent"
   - False Positive rate  < 2.5% : scanner is graded "very good"
   - False Positive rate  < 5.0% : scanner is graded "good enough"
   - False Positive rate < 10.0% : scanner is graded "rather bad"
   - False Positive rate < 20.0% : scanner is graded "very bad"
   - False Positive rate > 20.0% : scanner is graded "useless"

Regarding the ability of scanners to avoid FP alarms, 7 (out of 21)
products in test reported NO SINGLE False Positive alarm, both in the
file and macro zoo testbeds, and are therefore rated "perfect":
      -----------------------------------------------------------------
      "Perfect" FP-avoiding W-98 scanners:  AVA,AVG,AVK,DSE,INO,RAV,SCN
      -----------------------------------------------------------------

Remark: QHL and VSP also produced no false alarm, though at a very low
level of detection.

******************************************************************
Findings W98.6: Avoidance of False-Positive Alarms needs further work
                as many scanners have significant alarm rates.
                7 FP-avoiding perfect W-98 scanners:
                      AVA,AVG,AVK,DSE,INO,RAV,SCN
******************************************************************

Eval W98.07: Detection of Macro and Script Malware under W-98
=============================================================

Since test "1997-07", VTC has also tested the ability of AV products to
detect non-viral malware. An essential argument for this category is that
customers are interested in also being warned about and protected from
non-viral, non-worm malicious objects such as trojans, whose payload may
be disastrous to their work (e.g. stealing passwords). Since VTC test
"1999-03", malware detection has been a mandatory part of VTC tests, both
for submitted products and for those downloaded as free evaluation
copies. A growing number of scanners is indeed able to detect non-viral
malware.

The following grid (admittedly with reduced granularity) is applied to
classify detection of macro and script malware:
   - detection rate  = 100%   : scanner is "perfect"
   - detection rate  >  90%   : scanner is "excellent"
   - detection rate of 80-90% : scanner is "very good"
   - detection rate of 60-80% : scanner is "good enough"
   - detection rate  <  60%   : scanner is "not good enough"

Presently, 2 products (last time: NONE) "perfectly" detect all
(non-viral) macro and script malware in the VTC testbeds:

Concerning Macro AND Script malware detection:
      ------------------------------------------------------
      "Perfect" macro/script malware detectors under W-98:
                        macro / script
                 PAV ( 100.0%  100.0% )
                 SCN ( 100.0%  100.0% )
      "Excellent" detectors:
                 FSE (  99.8%  100.0% )
                 AVK (  99.8%  100.0% )
                 AVP (  99.8%  100.0% )
      "Very Good" detectors:
                 RAV (  97.7%   81.8% )
                 DSE (  85.0%   81.8% )
      ------------------------------------------------------

Concerning macro malware detection only, 2 products are rated "perfect",
and 11 more reach grade "excellent":
      ---------------------------------------------------
      "Perfect" macro malware detectors under W-98:
                 PAV ( 100.0% )
                 SCN ( 100.0% )
      "Excellent" macro malware detectors:
                 FSE (  99.8% )
                 AVK (  99.8% )
                 AVP (  99.8% )
                 CMD (  99.5% )
                 FPR (  99.5% )
                 FPW (  99.5% )
                 NVC (  98.4% )
                 RAV (  97.7% )
                 INO (  93.4% )
                 AVX (  92.0% )
                 DRW (  90.8% )
      "Very Good" macro malware detectors:
                 ANT (  89.4% )
                 AVA (  89.0% )
                 NAV (  86.4% )
                 DSE (  85.0% )
                 AVG (  82.6% )
      ---------------------------------------------------

Concerning script malware detection only, 5 products are rated "perfect",
and 2 more reach grade "very good":
      ---------------------------------------------------
      "Perfect" script malware detectors under W-98:
                 AVK ( 100.0% )
                 AVP ( 100.0% )
                 FSE ( 100.0% )
                 PAV ( 100.0% )
                 SCN ( 100.0% )
      "Excellent" script malware detectors under W-98:   ---
      "Very Good" script malware detectors under W-98:
                 DSE (  81.8% )
                 RAV (  81.8% )
      -----------------------------------------------------

**************************************************************************
Findings W98.7: Macro/Script Malware detection under W-98 is slowly
                improving.
                2 macro/script malware detectors "perfect":   PAV,SCN
                3 macro/script malware detectors "excellent": FSE,AVK,AVP
                2 macro/script malware detectors "very good": RAV,DSE
                **********************************************************
                Macro malware detection is significantly improving:
                 2 macro malware detectors "perfect":   PAV,SCN
                11 macro malware detectors "excellent": AVK,AVP,CMD,FPR,FPW,
                                                        FSE,NVC,RAV,INO,AVX,DRW
                 5 macro malware detectors "very good": ANT,AVA,NAV,DSE,AVG
                **********************************************************
                Script malware detection is underdeveloped:
                 5 script malware detectors "perfect":   AVK,AVP,FSE,PAV,SCN
                 0 script malware detectors "excellent": ---
                 2 script malware detectors "very good": DSE,RAV
*************************************************************************

Eval W98.SUM: Grading of W-98 products:
=======================================

Under the scope of VTC's grading system, a "Perfect W-98 AV/AM product"
would have the following characteristics:

Definition (1): A "Perfect AntiVirus (AV) product"
--------------------------------------------------
  1) Will detect ALL viral samples "In-The-Wild" AND at least 99.9% of
     zoo samples, in ALL categories (file, boot and script-based
     viruses), always with the same high precision of identification
     and in every infected sample,
  2) Will detect ALL ITW viral samples in compressed objects for all
     (6) popular packers, and
  3) Will NEVER issue a False Positive alarm on any sample which is
     not viral.

Definition (2): A "Perfect AntiMalware (AM) product"
----------------------------------------------------
  1) Will be a "Perfect AntiVirus product", that is:
         100% ITW detection AND >99% zoo detection
     AND high precision of identification
     AND high precision of detection
     AND 100% detection of ITW viruses in compressed objects
     AND 0% False-Positive rate,
  2) AND it will also detect essential forms of malicious software,
     at least in unpacked forms, reliably at high rates (>90%).
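For illustration, the two definitions can be read as a simple checklist.
The following minimal Python sketch (the W98Result record, its field
names and the example values are illustrative assumptions, not VTC data
or tooling) encodes Definitions (1) and (2):

   # Illustrative sketch only: Definitions (1) and (2) as boolean checks.
   from dataclasses import dataclass

   @dataclass
   class W98Result:              # illustrative record, not VTC's format
       itw_rate: float           # ITW detection, percent of infected files
       zoo_rate: float           # zoo detection (lowest of macro/script), percent
       packers_ok: int           # packers (of 6) with 100% ITW detection
       fp_rate: float            # false-positive rate, percent
       malware_rate: float       # malware detection (lowest of macro/script), percent

   def perfect_av(r):            # Definition (1)
       return (r.itw_rate == 100.0 and r.zoo_rate >= 99.9
               and r.packers_ok == 6 and r.fp_rate == 0.0)

   def perfect_am(r):            # Definition (2)
       return perfect_av(r) and r.malware_rate > 90.0

   # Hypothetical example values (not taken from the result tables):
   example = W98Result(100.0, 99.9, 6, 0.0, 92.0)
   # perfect_av(example) -> True, perfect_am(example) -> True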
*********************************************************************
   In VTC test "2001-10", we found
        **  1 perfect W98 AV product:  AVK  **
   **********************************************
   but we found
        ****  No perfect W98 AM product  ****
*********************************************************************

But several products seem to approach our definitions at a rather high
level (taking into account the grades "perfect", defined at the 100%
level, and "excellent", defined as >99% for virus detection and >90%
for malware detection):

Test category:             "Perfect"                    "Excellent"
--------------------------------------------------------------------
W98 zoo macro test:        AVK,FSE,PAV,SCN              AVP,AVX,CMD,DRW,
                                                        FPR,FPW,INO,NAV
W98 zoo script test:       AVK,AVP                      FSE,PAV,SCN
W98 ITW tests:             AVK,AVP,AVX,DRW,             -----
                           FSE,INO,NAV,PAV,SCN
W98 pack-tests:            AVK,AVP,CMD,FPR,FPW,PAV,SCN  -----
W98 FP avoidance:          AVA,AVG,AVK,DSE,INO,RAV,SCN  -----
--------------------------------------------------------------------
W98 Macro Malware Test:    PAV,SCN                      FSE,AVK,AVP,CMD,FPR,
                                                        FPW,NVC,RAV,INO,AVX,DRW
W98 Script Malware Test:   AVK,AVP,FSE,PAV,SCN          -----
--------------------------------------------------------------------

In order to support the race for more customer protection, we evaluate
the order of performance in this W-98 test with a simple algorithm:
counting placements over the test categories above, weighting each
"perfect" grade with 2 points and each "excellent" grade with 1 point,
for the first places (a short sketch of this point count follows the
rankings below):

************************************************************
"Perfect" W-98 AntiVirus product:
     1st place:  AVK                       (10 points)
************************************************************
"Excellent" W-98 AntiVirus products:
     2nd place:  SCN                       ( 9 points)
     3rd place:  AVP,PAV                   ( 7 points)
     5th place:  FSE,INO                   ( 5 points)
     7th place:  AVX,CMD,DRW,FPR,FPW,NAV   ( 3 points)
    13th place:  AVA,AVG,DSE,RAV           ( 2 points)
************************************************************
"Perfect" W-98 AntiMalware product:
     =NONE=                                (maximum: 14 points)

"Excellent" W-98 AntiMalware products:
     1st place:  AVK,SCN                   (13 points)
     3rd place:  PAV                       (11 points)
     4th place:  AVP                       (10 points)
     5th place:  FSE                       ( 8 points)
     6th place:  INO                       ( 6 points)
     7th place:  AVX,CMD,DRW,FPR,FPW       ( 4 points)
    12th place:  RAV                       ( 3 points)
************************************************************
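The point count described above can be reproduced with a short script.
The following minimal Python sketch (variable and function names are
illustrative; the category lists are transcribed from the summary table
above) sums 2 points per "perfect" grade and 1 point per "excellent"
grade:

   # Illustrative sketch only: reproduce the point count (2 points per
   # "perfect", 1 point per "excellent") over the W-98 test categories.
   from collections import defaultdict

   # (products rated "perfect", products rated "excellent") per category,
   # transcribed from the summary table above
   AV_CATEGORIES = {
       "zoo macro":  ("AVK FSE PAV SCN", "AVP AVX CMD DRW FPR FPW INO NAV"),
       "zoo script": ("AVK AVP",         "FSE PAV SCN"),
       "ITW":        ("AVK AVP AVX DRW FSE INO NAV PAV SCN", ""),
       "pack":       ("AVK AVP CMD FPR FPW PAV SCN", ""),
       "FP":         ("AVA AVG AVK DSE INO RAV SCN", ""),
   }
   AM_EXTRA = {
       "macro malware":  ("PAV SCN", "FSE AVK AVP CMD FPR FPW NVC RAV INO AVX DRW"),
       "script malware": ("AVK AVP FSE PAV SCN", ""),
   }

   def points(categories):
       score = defaultdict(int)
       for perfect, excellent in categories.values():
           for prod in perfect.split():
               score[prod] += 2
           for prod in excellent.split():
               score[prod] += 1
       return dict(score)

   av_points = points(AV_CATEGORIES)                   # AVK 10, SCN 9, AVP/PAV 7, ...
   am_points = points({**AV_CATEGORIES, **AM_EXTRA})   # AVK/SCN 13, PAV 11, AVP 10, ...

Applied to the category lists above, this count reproduces the published
totals (e.g. AVK: 10 AV points and 13 AM points; a product "perfect" in
all 7 categories would reach the 14-point maximum).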