===================================================
File 6ASUMOV.TXT:
Overview: Results of VTC Test 1999-03 (March 1999):
===================================================
Formatted with non-proportional font (Courier)

Content of this file:
=====================
1) General/Background
2) Special problems/experiences in test "1999-03"
3) Results and evaluation
4) Overview Tables
   Table A1: Detection Rate of File/Boot/Macro Viruses in Full Test (DOS)
   Table AA: Comparison: File/Macro Virus Detection Rate
             in last 5 VTC tests (DOS)
   Table A2: Detection Rate of Boot/File/Macro Viruses
             in In-The-Wild Test (DOS)
   Table A3: Detection Rate of Infected DOS Objects (Files/Images/Documents)
   Table A4: Consistency/Reliability of Virus Detection in Full Test (DOS)
   Table A5: Detection Rate of Non-Viral File+Macro Malware
   Table AB: Comparison: File/Macro Virus Detection Rate
             in last 2 VTC tests under Windows 98
   Table A6: Detection Rate of File/Macro Viruses in Full/ITW Test (Windows 98)
   Table AC: Comparison: File/Macro Virus Detection Rate
             in last 4 VTC tests under Windows NT
   Table A7: Detection Rate of File/Macro Viruses in Full/ITW Test (Windows NT)

1) General/Background:
======================
The test presented here stands on the shoulders of previous VTC tests
performed by Vesselin Bontchev (his last test, "1994-07", is available
for comparison from another entry on our ftp site) and of the regular
VTC tests published since 1997-07; it is an upgrade of VTCs last test,
"1998-10", published in October 1998. For details of previous tests,
see VTCs homepage:
     http://agn-www.informatik.uni-hamburg.de/vtc

Concerning operating platforms, tests are performed for AV/AM products
running under DOS, Windows 98 and Windows NT, respectively. As in
previous tests, VTC tested on-demand AntiVirus products under DOS for
their ability to detect boot, file and macro viruses in the resp.
virus databases. File and macro virus detection (both in the full
databases and in the "In-the-Wild" testbeds) was tested for scanner
versions running under Windows 98 and Windows NT. For these tests,
the virus databases were significantly updated since the last test.

A separate test was performed on multiple generations of 4 polymorphic
file viruses; this is a subset generated by VTCs dynamic test
procedure for the detection of polymorphic viruses. Another separate
test is based on file viruses generated by the VKIT file virus
generator. Further to the macro and file malware tests, this test also
determined the quality of detection of viruses in objects packed with
four popular packers, and it analysed the ability of AV products to
avoid "false positive" alarms.

The protocols produced by each scanner were analysed for indications
of how many viruses and infected objects were detected, and whether
identification is consistent (that is: producing the same name for all
infected files, images and documents, respectively) and reliable
(detecting all infected files, images and documents).

Databases of file, boot and macro viruses were frozen on their status
as known and available to VTC on November 30, 1998. For a detailed
index of the respective virus databases, see A3TSTBED.ZIP. Following
discussions about the virality of all samples in the boot and file
virus databases, a careful cleaning process was applied: all samples
of potentially questionable virality were moved to a special database,
from which self-reproduction experiments were started.
We also received some information from experts of tested products
which led us to move a few more samples to this "doubtful" database
for further inspection. No questions or doubts were raised about the
macro virus testbed. It is VTCs policy to move samples of questionable
virality or malicity back to the related testbeds once we can prove
definitively that they are viral or otherwise malicious.

After a series of pre-tests during which test procedures were
installed (esp. for products with new engines and those participating
for the first time), final tests were performed on available products
received before January 17, 1999. Results were included for the most
recent scanner and signature version of each product.

2) Special problems/experiences in test "1999-03":
==================================================
Again, the complexity of the test and problems with scanners and
systems delayed publication by about 4 weeks. We experienced the
following specific problems:

Problem #1: During preparation of and test runs using the very large
     file virus database, it was observed that the system support for
     managing files and directories produced unforeseen results. When
     moving larger portions of files and directories (esp. with
     Explorer under Windows NT), some files were not moved and some
     were even lost. During tests, it was observed (esp. when scanners
     ran in Windows NT's DOS box) that not all directories had been
     scanned. During tests, it became evident that NTFS (and also FAT)
     did not work as reliably as assumed. Similar observations have
     also been reported by other experts. This problem could only be
     overcome with extreme care and with the effort of comparing which
     files/directories had been touched and which had been suitably
     processed by each scanner. After detailed quality analysis, we
     are confident that the test results are of "good quality".

Problem #2: Just a "normal" problem is that a growing number of AV
     products do not behave "well" (conforming with VTC test
     requirements) on our growing databases. Some products even
     stopped scanning for unforeseen reasons. To avoid crashes on the
     very large file zoo testbed, we partitioned this testbed into
     about 30 partitions. When some product crashed on one partition,
     we tried to pursue the test either on the rest of that partition
     or, if the product continued to crash, started scanning the next
     partition (a driver sketch illustrating this procedure is given
     at the end of section 3). This care cost much time and human
     effort.

Problem #3: This test contained 33 AV products, and it is probably the
     largest such test ever performed. Counting the 3 platforms, the
     number of products and the partitioning of testbeds, the test
     required about 6,000 single test runs. These factors (together
     with the heavy load of this report's prime author following from
     his engagement in Y2K projects, as a consequence of his early
     involvement in Y2K problem analysis with his first contribution
     dating back to March 1992) summed up to delay the publication
     from end-February until mid-April 1999.

3) Results and evaluation:
==========================
For details of results on scanner tests under DOS, Windows 98 and
Windows NT, see test documents 6B-6F. For a detailed analysis of
performances, see 7EVAL.TXT. All documents are in ASCII, formatted
(and best reproducible) with non-proportional fonts (Courier),
72 columns.
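The partitioned test procedure described under Problem #2 can be
illustrated by a minimal driver loop. The following sketch is purely
illustrative and not the actual VTC test harness; the scanner command
("scan"), the directory layout ("zoo") and the protocol file name are
hypothetical. It assumes a scanner that returns a non-zero exit code
when it crashes or aborts:

   import pathlib
   import subprocess

   MAX_RETRIES = 2   # restarts of a crashing scanner per partition

   def run_partitioned_test(scanner_cmd, zoo_root, logfile):
       # Run a scanner over a zoo testbed split into directory
       # partitions. If the scanner crashes on a partition, it is
       # restarted there up to MAX_RETRIES times; if it keeps
       # crashing, the test proceeds with the next partition.
       partitions = sorted(p for p in pathlib.Path(zoo_root).iterdir()
                           if p.is_dir())
       with open(logfile, "a") as log:
           for part in partitions:
               for attempt in range(1 + MAX_RETRIES):
                   result = subprocess.run(scanner_cmd + [str(part)],
                                           stdout=subprocess.PIPE,
                                           text=True)
                   log.write(result.stdout)
                   if result.returncode == 0:
                       break   # partition scanned cleanly
                   log.write("*** crash in %s (attempt %d)\n"
                             % (part, attempt + 1))
               else:
                   log.write("*** giving up on partition %s\n" % part)

   # Hypothetical usage:
   # run_partitioned_test(["scan", "/report"], "zoo", "protocol.log")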
4) Overview Tables:
===================
In order to extract optimum information from the test results,
comparative tables were produced for essential sets, both for the
"full (zoo) test" and for the subset regarded as equivalent to
Joe Wells' and the WildList Organisation's "In-The-Wild" list.

Result tables are given in ASCII, in a form from which an EXCEL or
LOTUS spreadsheet can easily be derived: simply delete the headline
and import the "pure" table into the related spreadsheet (a minimal
conversion sketch is given after the table list below). VTC
deliberately did NOT follow suggestions to (optionally) present the
results in XLS form, to avoid ANY possible macro-viral side effect
(e.g. when some mirror site inadvertently implants an XLS virus
during its preprocessing, e.g. when adding some information about the
mirror site).

In order to determine whether and to what degree AntiVirus products
also help users to identify non-replicating malicious software such
as trojan horses, virus generators and intended (though not
replicating) viruses, a special test was performed to detect known
non-viral malware related to macro and file viruses. Macro malware
was selected as published in VTCs monthly "List of Known Macro
Viruses". For this test, known network malware (e.g. worms, hostile
applets and malicious ActiveX controls) was deliberately excluded,
but will be incorporated into a future malware test.

Different from previous tests (where we agreed not to publish malware
detection results for those products whose producers requested
abstention), malware tests are from now on a mandatory part of VTC
tests. We regret that we cannot admit AV products whose developers
are evidently not willing to protect their customers from such
threats, or who forbid that a related capability of their product be
tested by an independent institution. Fortunately, all relevant AV
producers agreed with the related VTC test conditions.

Detailed results are collected in separate files:

Operating system = DOS:
-----------------------
  6bdosfil.txt: Detection Rates of File Viruses (full/ITW) and
                File Malware, as well as packed objects
  6cdosboo.txt: Detection Rates of Boot Viruses (full/ITW)
  6ddosmac.txt: Detection Rates of Macro Viruses (full/ITW) and
                Macro Malware, as well as packed objects

Operating system = Windows 98:
------------------------------
  6ewin98.txt:  Detection Rates of File/Macro Viruses (full/ITW)

Operating system = Windows NT:
------------------------------
  6fwinnt.txt:  Detection Rates of File/Macro Viruses (full/ITW)

This text contains the following tables to give an overview of
important VTC test "1999-03" results:

  Table A1: Detection Rate of File/Boot/Macro Viruses in Full Test (DOS)
  Table AA: Comparison: File/Macro Virus Detection Rate
            in last 5 VTC tests (DOS)
  Table A2: Detection Rate of Boot/File/Macro Viruses
            in In-The-Wild Test (DOS)
  Table A3: Detection Rate of Infected DOS Objects (Files/Images/Documents)
  Table A4: Consistency/Reliability of Virus Detection in Full Test (DOS)
  Table A5: Detection Rate of Non-Viral File+Macro Malware
  Table AB: Comparison: File/Macro Virus Detection Rate
            in last 2 VTC tests under Windows 98
  Table A6: Detection Rate of File/Macro Viruses in Full/ITW Test (Windows 98)
  Table AC: Comparison: File/Macro Virus Detection Rate
            in last 4 VTC tests under Windows NT
  Table A7: Detection Rate of File/Macro Viruses in Full/ITW Test (Windows NT)

Much more information is available from the detailed tables, including
virus detection related to goat files, images and documents (see the
related chapters, as referred to in "1CONTENT.TXT").
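As noted above, the ASCII result tables can be turned into a
spreadsheet after deleting the headline. The following minimal sketch
(an illustration only, not part of the VTC tool chain; the file names
are hypothetical) converts such a whitespace-separated table into a
CSV file that EXCEL or LOTUS can import:

   import csv
   import re

   def table_to_csv(infile, outfile):
       # Convert one of the ASCII result tables to CSV. The headline
       # must have been deleted beforehand, as described above; ruler
       # lines (runs of '=' and '-') and empty lines are skipped.
       with open(infile) as f, open(outfile, "w", newline="") as out:
           writer = csv.writer(out)
           for line in f:
               line = line.strip()
               if not line or set(line) <= set("=-"):
                   continue   # skip rulers and empty lines
               writer.writerow(re.split(r"\s+", line))

   # Hypothetical usage:
   # table_to_csv("tableA1.txt", "tableA1.csv")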
------------------------ Overview Table A1: ---------------------------

Table A1 contains detection rates for boot, file and macro viruses for
all products tested, including those where several subsequent versions
were received during the test period.

Table A1: Detection Rate of File/Boot/Macro Viruses in Full DOS Test:
=====================================================================
  Scanner    Boot Viruses    File Viruses    Macro Viruses
  ------------------------------------------------------
  Testbed    1197  100.0     17148  100.0    2874  100.0
  ------------------------------------------------------
  ACU           -     --         -     --       -     --
  AVA        1141   95.3     16739   97.6    2757   95.9
  AVG         977   81.6     14942   87.1    2371   82.5
  AVK        1185   99.0     12856   75.0    2863   99.6
  AVP        1175   98.2     17089   99.7    2869   99.8
  DRW        1118   93.4     16837   98.2    2825   98.3
  DSS        1186   99.1     17109   99.8    2874  100.0
  FPR/FMA    1106   92.4     16928   98.7    2868   99.8
  FSE        1185   99.0     16729   97.6    2863   99.6
  HMV           -     --         -     --    2860   99.5
  INO        1157   96.7     16822   98.1    2867   99.8
  IRS        1093   91.3      8850   51.6    2560   89.1
  ITM         738   61.7     11011   64.2    2038   70.9
  NAV        1129   94.3     13230   77.2    2865   99.7
  NOD        1186   99.1     16612   96.9    2869   99.8
  NVC        1171   97.8     16741   97.6       -     --
  PAV        1185   99.0     12632   73.7    2859   99.5
  PRO         339   28.3      6091   35.5    2341   81.5
  RAV           -     --         -     --    2851   99.2
  SCN        1015   84.8     17113   99.8    2874  100.0
  TSC         744   62.2      6772   39.5    2200   76.5
  VET         144   12.0     11202   65.3    2804   97.6
  VSP         705   58.9     12289   71.7       -     --
  ------------------------------------------------------

Explanation of the different columns:

 1) "Scanner" is the code name of the scanner as listed in file
    A2SCANLS.TXT.

 2) "Boot Viruses" is the number of different boot/DBR infecting
    *viruses* in the virus collection used during the tests which
    have been detected by the particular scanner; the percentage
    relative to the full testbed is given in the adjacent column.

 3) "File Viruses" is, analogously, the number of different file
    infecting *viruses* detected by the particular scanner. We define
    two viruses as being different if they differ in at least one bit
    in their non-modifiable parts. For variably encrypted viruses,
    the virus body has to be decrypted before the comparison is
    performed. For polymorphic viruses, additionally the part of the
    virus which is modified during the replication process has to be
    ignored.

 4) "Macro Viruses" is, analogously, the number of different macro
    *viruses* detected by the particular scanner.

Several scanners were no longer available for testing, due to "market
processes". Such products (IBM, TBA and TNT) are marked with "*" in
the comparison tables (AA, AB and AC).

------------------------ Overview Table AA: ---------------------------

Table AA indicates, for the scanners tested (most recent version in
each test), how file and macro virus detection rates have developed
over the last 5 VTC tests (1997-02 through 1999-03). Results of these
tests are given, and the difference DELTA between the last two tests
(1998-10 and 1999-03) is calculated. A "+" sign indicates that the
resp. scanner improved in the related category, whereas a "-" sign
indicates that the present result is not as good as the previous one.
Differences within +-0.5% are regarded as "statistical", as they may
depend upon differences in signature updates. In some cases, a
comparison is impossible due to problems in the previous or present
test.
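For example (values taken from Table AA below): SCN improved its file
virus detection rate from 87.8% in test 1998-10 to 99.8% in test
1999-03, giving DELTA = 99.8 - 87.8 = +12.0, whereas AVK dropped from
90.0% to 75.0%, giving DELTA = -15.0.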
Table AA: Comparison: File/Macro Virus Detection Rate
          in last 5 VTC tests under DOS:
=====================================================
        ------- File Virus Detection -------  ------ Macro Virus Detection ------
SCAN-   97/02 97/07 98/02 98/10 99/03 DELTA   97/02 97/07 98/02 98/10 99/03 DELTA
NER       %     %     %     %     %     %       %     %     %     %     %     %
---------------------------------------------------------------------------------
ALE      98.8  94.1  89.4    -     -     -     96.5  66.0  49.8    -     -     -
AVA      98.9  97.4  97.4  97.9  97.6  -0.3    99.3  98.2  80.4  97.2  95.9  -1.3
AVG      79.2  85.3  84.9  87.6  87.1  -0.5    25.2  71.0  27.1  81.6  82.5  +0.9
AVK        -     -     -   90.0  75.0 -15.0      -     -     -   99.7  99.6  -0.1
AVP      98.5  98.4  99.3  99.7  99.7   0.0    99.3  99.0  99.9 100.0  99.8  -0.2
ANT      73.4  80.6  84.6  75.7    -     -     58.0  68.6  80.4  56.6    -     -
DRW      93.2  93.8  92.8  93.1  98.2  +5.1    90.2  98.1  94.3  99.3  98.3  -1.0
DSS      99.7  99.6  99.9  99.9  99.8  -0.1    97.9  98.9 100.0 100.0 100.0   0.0
FMA        -     -     -     -     -     -     98.6  98.2  99.9    -     -     -
FPR      90.7  89.0  96.0  95.5  98.7  +3.2    43.4  36.1  99.9  99.8  99.8   0.0
FSE        -     -   99.4  99.7  97.6  -2.1      -     -   99.9  90.1  99.6  +9.5
FWN        -     -     -     -     -     -     97.2  96.4  91.0  85.7    -     -
HMV        -     -     -     -     -     -       -     -   98.2  99.0  99.5  +0.5
IBM      93.6  95.2  96.5    -     *     *     65.0  88.8  99.6    -     *     *
INO        -     -   92.0  93.5  98.1  +4.6      -     -   90.3  95.2  99.8  +4.6
IRS        -   81.4  74.2    -   51.6    -       -   69.5  48.2    -   89.1    -
ITM        -   81.0  81.2  65.8  64.2  -1.6    81.8  58.2  68.6  76.3  70.9  -5.4
IVB       8.3    -     -     -   96.9    -       -     -     -     -     -     -
NAV      66.9  67.1  97.1  98.1  77.2 -20.9    80.7  86.4  98.7  99.8  99.7  -0.1
NOD        -     -     -   96.9    -     -       -     -     -     -   99.8    -
NVC      87.4  89.7  94.1  93.8  97.6  +3.8    13.3  96.6  99.2  90.8    -     -
PAN        -     -   67.8    -     -     -       -     -   73.0    -     *     -
PAV        -   96.6  98.8    -   73.7    -       -   93.7 100.0    -   99.5    -
PCC        -     -     -     -     -     -       -   67.6    -     -     -     -
PCV      67.9    -     -     -     -     -       -     -     -     -     -     -
PRO        -     -     -     -   35.5    -       -     -     -     -   81.5    -
RAV        -     -     -   71.0    -     -       -     -     -   99.5  99.2  -0.3
SCN      83.9  93.5  90.7  87.8  99.8 +12.0    95.1  97.6  99.0  98.6 100.0  +1.4
SWP      95.9  94.5  96.8  98.4    -     -     87.4  89.1  98.4  98.6    -     -
TBA      95.5  93.7  92.1  93.2    *     *     72.0  96.1  99.5  98.7    *     *
TSC        -     -   50.4  56.1  39.5 -16.6      -     -   81.9  17.0  76.5 +59.5
TNT      58.0    -     -     -     *     *       -     -     -     -     *     *
VDS        -   44.0  37.1    -     -     -     16.1   9.9   8.7    -     -     -
VET        -   64.9    -     -   65.3    -       -   94.0  97.3  97.5  97.6  +0.1
VRX        -     -     -     -     -     -       -     -     -     -     -     -
VBS      43.1  56.6    -   35.5    -     -       -     -     -     -     -     -
VHU      19.3    -     -     -     -     -       -     -     -     -     -     -
VSA        -     -   56.9    -     -     -       -     -   80.6    -     -     -
VSP        -     -     -   76.1  71.7  -4.4      -     -     -     -     -     -
VSW        -     -   56.9    -     -     -       -     -   83.0    -     -     -
VTR      45.5    -     -     -     -     -      6.3    -     -     -     -     -
XSC      59.5    -     -     -     -     -       -     -     -     -     -     -
---------------------------------------------------------------------------------
Mean:    74.2% 84.8% 84.4% 85.4% 81.2% -2.2%   69.6% 80.9% 83.8% 89.6% 93.6% +4.3%
---------------------------------------------------------------------------------

Explanation of the new column:

 5) DELTA (=Change) is the difference (in percentage points) between
    the results of tests "1998-10" and "1999-03".

------------------------ Overview Table A2: ---------------------------

Table A2 indicates how many viruses belonging to the "In-The-Wild"
subset of the full virus databases have been found by the respective
scanner. The optimum measure is 100%. For detailed results, see
"6bdosfil.txt", "6cdosboo.txt" and "6ddosmac.txt".
Table A2: Detection Rate of Boot/File/Macro Viruses in DOS-ITW Test:
====================================================================
  Scanner    Boot Viruses    File Viruses    Macro Viruses
  ------------------------------------------------------
  Testbed      76  100.0       87  100.0       83  100.0
  ------------------------------------------------------
  AVA          75   98.7       87  100.0       83  100.0
  AVG          76  100.0       87  100.0       83  100.0
  AVK          76  100.0       69   79.3       83  100.0
  AVP          76  100.0       87  100.0       83  100.0
  DRW          75   98.7       87  100.0       83  100.0
  DSS          76  100.0       87  100.0       83  100.0
  FPR          76  100.0       87  100.0       83  100.0
  FSE          76  100.0       87  100.0       83  100.0
  HMV          --     --       --     --       83  100.0
  INO          76  100.0       87  100.0       83  100.0
  IRS          76  100.0       72   82.8       77   92.8
  ITM          74   97.4       84   96.6       82   98.8
  NAV          75   98.7       80   92.0       83  100.0
  NOD          76  100.0       87  100.0       83  100.0
  NVC          76  100.0       87  100.0       --     --
  PAV          76  100.0       68   78.2       83  100.0
  PRO          50   65.8       59   67.8       76   91.6
  RAV          --     --       --     --       83  100.0
  SCN          76  100.0       87  100.0       83  100.0
  TSC          73   96.1       70   80.5       74   89.2
  VET           5    6.6       81   93.1       83  100.0
  VSP          48   63.2       69   79.3       --     --
  ------------------------------------------------------

------------------------ Overview Table A3: ---------------------------

Table A3 indicates how many infected objects (files, boot/MBR images,
Word and EXCEL documents) have been found by the respective scanner in
the full database. The optimum measure is 100%. For detailed results,
see "6bdosfil.txt", "6cdosboo.txt" and "6ddosmac.txt".

Table A3: Detection Rate of Infected DOS Objects (Files/Images/Documents):
==========================================================================
             -------- Number of objects infected: --------
  Scanner    Boot Viruses     File Viruses    Macro Viruses
  -------------------------------------------------------
  Testbed    4746  100.0    128534  100.0     7765  100.0
  -------------------------------------------------------
  AVA        4146   87.4    125823   97.9     7504   96.6
  AVG        3898   82.1    115759   90.1     6390   82.3
  AVK        4714   99.3    100188   77.9     7747   99.8
  AVP        4628   97.5    128000   99.6     7758   99.9
  DRW        4169   87.8    126776   98.6     7674   98.8
  DSS        4715   99.3    128187   99.7     7765  100.0
  FPR        4467   94.1    127962   99.6     7743   99.7
  FSE        4714   99.3    124741   97.0     7747   99.8
  HMV          --     --        --     --     7726   99.5
  INO        4312   90.9    126064   98.1     7745   99.7
  IRS        4242   89.4     72613   56.5     6816   87.8
  ITM        2909   61.3     78024   60.7     5314   68.4
  NAV        4266   89.9    104276   81.1     7741   99.7
  NOD        4715   99.3    125436   97.6     7750   99.8
  NVC        4625   97.5    126097   98.1       --     --
  PAV        4714   99.3     97899   76.2     7743   99.7
  PCC          --     --        --     --       --     --
  PRO        1138   24.0     49629   38.6     6495   83.6
  RAV          --     --        --     --     7724   99.5
  SCN        4165   87.8    128101   99.7     7765  100.0
  TSC        2899   61.1     53697   41.8     6058   78.0
  VET         442    9.3     87520   68.1     7629   98.2
  VSP        2323   48.9     84489   65.7       --     --
  -------------------------------------------------------

Explanation of the different columns (see also explanations 1-4 at
Table A1 and 5 at Table AA):

 6) "Number (%) of objects infected with file viruses" is the number
    of *files* infected with file infecting viruses from the test set
    which are detected by the particular scanner as being infected;
    the percentage relative to the full set of infected files is
    given in the adjacent column. We often have more than one
    infected file per virus, but not all viruses are represented by
    the same number of files, so this number does not give a good
    impression of the real detection rate of the scanner. It is
    included here only for completeness. Of course, it still *does*
    provide some information - usually the better a scanner is, the
    more files it will detect as infected.

 7) "Number (%) of objects infected with boot viruses" is the number
    of infected boot sectors in the test set that the scanner detects
    as infected. This field is analogous to field 6, though it lists
    infected boot sectors, not files.
8) "Number of objects infected with macro viruses" is the number of infected documents in the test set that the scanner detects as infected. This field is analogous to field 5, though it lists infected documents, not files. ------------------------ Overview Table A4: --------------------- Table A4 provides information about the "quality" of detection. Inconsistent or unreliable detection means that some virus is identified with different names in different objects belonging to the same virus. Unreliable detection means that some virus is identified at least once, though not in all objects infected with the related virus. Optimum measures both for inconsistency and unreliability are 0%. For detailed results, see "6bdosfil.txt", "6cdosboo.txt" and "6ddosmac.txt". Table A4: Consistency/Reliability of Virus Detection in Full DOS Test: ====================================================================== Scanner Unreliable Identification: Unreliable Detection: Boot(%) File(%) Macro(%) Boot(%) File(%) Macro(%) --------------------------------------------------------------- AVA 2.6 4.0 0.8 8.1 0.8 0.2 AVG 2.0 3.2 0.3 1.4 1.7 0.5 AVK 4.3 1.9 1.8 0.4 0.0 0.0 AVP 4.2 2.2 1.8 5.6 0.3 0.0 DSS 2.2 3.3 0.3 0.4 0.2 0.0 FPR 0.7 0.4 0.1 1.6 0.3 0.1 FSE 4.3 2.3 1.8 0.4 0.3 0.0 HMV -- -- 1.0 -- -- 0.3 INO 0.0 3.1 1.6 9.9 0.7 0.1 IRS 2.3 0.2 0.9 3.2 3.0 1.3 ITM 0.8 2.4 4.8 1.8 3.6 2.1 NAV 2.9 0.0 0.0 2.1 1.5 0.1 NOD 9.9 10.5 1.1 0.4 1.3 0.1 NVC 1.8 6.5 -- 1.1 1.6 -- PAV 4.3 1.8 1.8 0.4 0.2 0.0 PRO 0.2 1.0 1.5 11.6 4.5 2.1 RA7 -- -- -- -- -- -- RAV -- -- 2.3 -- -- 0.0 SCN 1.8 3.8 2.5 1.4 0.2 0.0 TSC 1.5 1.8 4.2 3.8 2.5 1.4 VET 0.1 1.5 2.1 3.8 3.6 0.1 VSP 2.5 14.0 -- 7.4 6.6 -- -------------------------------------------------------------- More Explanation of the different columns (see 1-8 at tables 1+3): 9) The fields "Unconsistent Identification" measures the relative amount (%) of those viruses where different names were assigned to the same virus. This is, to some extent, a measure of how precise the identification capacity of the resp. scanner is; optimum measure is 0%. 10) The fields "Unreliable Detection" measures the relative amount (%) of viruses which were only partly detected. Definition of unreliable detection is that at least one sample of the virus *is* detected and at least one sample of the virus is *not* detected. In some sense, unreliable detection is more dangerous than those cases when a scanner misses the virus completely, because an unreliably detected virus may be a hidden source of continuous viral infections. Remark: in comparison with previous VTC tests, we have refrained from reporting other features such as "unreliable identifications" and "multiple reports", to reduce information overload. ------------------------ Overview Table A5: --------------------- Table A5 indicates whether some AntiVirus DOS-product also detects non-viral malware, esp. including virus generators, trojans and intended (though not self-replicating) viruses. Results only apply to Macro Malware where VTCs "List of Known Macro Malware" displays the actual status of all known malicious threats. For detailed results see "6bdosfil.txt" and "6ddosmac.txt". 
Table A5: DOS Detection Rate of Non-Viral File+Macro Malware
============================================================
  Scanner    File Malware     Macro Malware
  ---------------------------------------
  Testbed    2485  100.0      142  100.0
  ---------------------------------------
  AVA        1653   66.5      130   91.5
  AVG        1622   65.3       98   69.0
  AVK        2356   94.8      136   95.8
  AVP        2194   88.3      136   95.8
  DRW        1855   74.6      116   81.7
  DSS        2424   97.5      140   98.6
  FPR        2216   89.2      139   97.9
  FSE        2204   88.7      136   95.8
  HMV          --     --      137   96.5
  INO        1870   75.3      136   95.8
  IRS        1081   43.5       76   53.5
  ITM        1113   44.8       39   27.5
  NAV        1909   76.8      129   90.8
  NOD        1575   63.4      137   96.5
  NVC        1725   69.4      128   90.1
  PAV        2356   94.8      134   94.4
  PRO         299   12.0       84   59.2
  RAV          --     --      116   81.7
  SCN        2416   97.2      139   97.9
  TSC        1421   57.2      105   73.9
  VET        1088   43.8      127   89.4
  VSP        1723   69.3       --     --
  ---------------------------------------

------------------------ Overview Table AB: ---------------------------

Table AB indicates, for the scanners tested (most recent version in
each test), how file and macro virus detection rates under Windows 98
developed between the last two tests (1998-10, when detection under
Windows 98 was first tested, and 1999-03). (Concerning the format of
this table, see Table AA.)

Table AB: Comparison: File/Macro Virus Detection Rate
          in last 2 VTC tests under Windows 98:
===========================================================
           File Virus Detection      Macro Virus Detection
SCAN-       98/10  99/03  DELTA       98/10  99/03  DELTA
NER           %      %      %           %      %      %
-----------------------------------------------------------
ACU            -      -      -           -    97.6     -
ANT          91.3     -      -         84.3     -      -
ANY            -      -      -         70.7     -      -
AVA          96.6   97.6   +1.0        96.7   95.9   -0.8
AVG            -    87.3     -           -    82.5     -
AVK          99.6   90.8   -8.8        99.6   99.6    0.0
AVP          99.9   99.9    0.0       100.0   99.2   -0.8
AVX            -    74.2     -           -      -      -
DSS          99.9   99.9    0.0       100.0  100.0    0.0
DRW            -      -      -           -    98.3     -
DWW            -    89.5     -           -      -      -
FPR/FMA        -    93.9     -         92.4   99.8   +6.4
FSE          99.8  100.0   +0.2       100.0  100.0    0.0
FWN            -      -      -         99.6   99.7   +0.1
HMV            -      -      -           -    99.5     -
IBM          92.8     *      *         94.5     *      *
INO          93.5   98.1   +4.6        88.1   99.8  +10.7
IRS          96.7   97.6   +0.9        99.0   99.5   +0.5
ITM            -    64.2     -           -    70.9     -
IVB            -      -      -         92.8   95.0   +2.2
NAV            -    96.8     -         95.3   99.7   +2.4
NOD            -    97.6     -           -    99.8     -
NVC          93.6   97.0   +3.4          -    99.1     -
PAV          98.4   99.9   +1.5        99.5   99.5    0.0
PCC            -    81.2     -           -    98.0     -
PRO            -    37.3     -           -    58.0     -
RAV          84.9     -      -         92.2     -      -
SCN          86.6   99.8  +13.2        97.7  100.0   +2.3
SWP          98.4     -      -         98.6     -      -
TBA          92.6     *      *         98.7     *      *
TSC            -    55.3     -           -    76.5     -
VBS            -      -      -         41.5     -      -
VBW            -    26.5     -           -    93.4     -
VET            -    66.3     -           -    97.6     -
VSP            -    86.4     -           -     0.4     -
-----------------------------------------------------------
Mean         95.0%  84.2%  +1.6%       92.1%  90.3%  +1.8%
-----------------------------------------------------------

------------------------ Overview Table A6: ---------------------------

Table A6 summarizes the results of the actual tests under Windows 98.
Tests were performed for detection of file and macro viruses. In
addition, detection of file and macro related malware was also
tested. For detailed results see "6ewin98.txt".
Table A6: Detection Rate of File/Macro Viruses and Malware
          in Full and ITW Tests for Windows 98:
=======================================================
  Scanner   File Viruses    File Malware    Macro Viruses   Macro Malware
  ----------------------------------------------------------------------
  Testbed   17148  100.0    2485  100.0     2874  100.0     2874  100.0
  ----------------------------------------------------------------------
  ACU           0     --      --     --     2806   97.6     2806   97.6
  AVA       16741   97.6    1659   66.8     2757   95.9     2757   95.9
  AVG       14971   87.3    1637   65.9     2371   82.5     2371   82.5
  AVK       15571   90.8    2356   94.8     2863   99.6     2863   99.6
  AVP       17137   99.9    2359   94.9     2850   99.2     2850   99.2
  AVX       12727   74.2    1460   58.8       --     --       --     --
  DRW       15354   89.5    1177   47.4     2824   98.3     2824   98.3
  DSS       17132   99.9    2426   97.6     2874  100.0     2874  100.0
  DWW       16837   98.2    1849   74.4     2823   98.2     2823   98.2
  FMA          --     --      --     --     2861   99.5     2861   99.5
  FPR       16110   93.9    2216   89.2     2868   99.8     2868   99.8
  FSE       17146  100.0    2468   99.3     2874  100.0     2874  100.0
  FWN          --     --      --     --     2864   99.7     2864   99.7
  HMV          --     --      --     --     2861   99.5     2861   99.5
  INO       16821   98.1    2193   88.2     2867   99.8     2867   99.8
  IRS       16734   97.6    2178   87.6     2860   99.5     2860   99.5
  ITM       11011   64.2    1113   44.8     2039   70.9     2039   70.9
  IVB          --     --      --     --     2729   95.0     2729   95.0
  NAV       16600   96.8    2235   89.9     2865   99.7     2865   99.7
  NOD       16738   97.6    1611   64.8     2869   99.8     2869   99.8
  NVC       16636   97.0    1725   69.4     2849   99.1     2849   99.1
  PAV       17137   99.9    2359   94.9     2859   99.5     2859   99.5
  PCC       13924   81.2    1524   61.3     2817   98.0     2817   98.0
  PRO        6398   37.3     291   11.7     1668   58.0     1668   58.0
  SCN       17113   99.8    2417   97.3     2874  100.0     2874  100.0
  TSC        9490   55.3    1421   57.2     2200   76.5     2200   76.5
  VBW        4541   26.5     404   16.3     2683   93.4     2683   93.4
  VET       11367   66.3    1090   43.9     2804   97.6     2804   97.6
  VSP       14815   86.4    1885   75.9       12    0.4       12    0.4
  ----------------------------------------------------------------------

------------------------ Overview Table AC: ---------------------------

Table AC indicates, for the scanners tested (most recent version in
each test), how file and macro virus detection rates under Windows NT
developed over the last 4 tests (since 1997-07, when detection under
Windows NT was first tested, up to 1999-03). (Concerning the format
of this table, see Table AA.)
Table AC: Comparison: File/Macro Virus Detection Rate
          in last 4 VTC tests under Windows-NT:
===========================================================
        ==== File Virus Detection ====   === Macro Virus Detection ===
Scan-   97/07 98/02 98/10 99/03 Delta    97/07 98/02 98/10 99/03 Delta
ner
-----------------------------------------------------------------------
ANT      88.9  69.2  91.3    -     -      92.2    -   85.7    -     -
ANY        -     -   69.7    -     -        -     -   70.5    -     -
AVA        -   97.4  96.6  97.1  +0.5       -   91.9  97.2  95.2  -2.0
AVG        -     -     -   87.3    -        -     -     -   82.5    -
AVK        -     -   99.6  90.2  -9.4       -     -   99.6  99.6   0.0
AVP        -     -   83.7  99.9 +16.2       -     -  100.0  99.2  -0.8
AVX        -     -     -   74.2    -        -     -     -   98.9    -
AW         -   56.4    -     -     -        -     -   61.0    -     -
DRW        -     -     -   93.3    -        -     -     -   98.3    -
DWW        -     -     -     -     -        -     -     -   98.2    -
DSS      99.6  99.7  99.9  99.3  -0.6     99.0 100.0 100.0 100.0   0.0
FPR/FMA    -   96.1    -   98.7    -        -   99.9  99.8  99.8   0.0
FSE        -   85.3  99.8 100.0  +0.2       -     -   99.9 100.0  +0.1
FWN        -     -     -     -     -        -     -   99.6  99.7  +0.1
HMV        -     -     -     -     -        -     -   99.0  99.5  +0.5
IBM      95.2  95.2  77.2    *     *      92.9  92.6  98.6    *     *
INO        -   92.8    -   98.1    -        -   89.7    -   99.8    -
IRS        -   96.3    -   97.6    -        -   99.1    -   99.5    -
IVB        -     -     -     -     -        -     -   92.8  95.0  +2.2
NAV      86.5  97.1    -   98.0    -      95.6  98.7  99.9  99.7  -0.2
NOD        -     -     -   97.6    -        -     -     -   99.8    -
NVC      89.6  93.8  93.6  96.4  +2.8     96.6  99.2    -   98.9    -
PAV      97.7  98.7  98.4  97.2  -1.2     93.5  98.8  99.5  99.4  -0.1
PRO        -     -     -   37.3    -        -     -     -   58.0    -
RAV        -   81.6  84.9  85.5  +0.6       -   98.9  99.5  99.2  -0.3
RA7        -     -     -   89.3    -        -     -     -   99.2    -
PCC      63.1    -     -     -     -        -   94.8    -     -     -
PER        -     -     -     -     -        -   91.0    -     -     -
SCN      94.2  91.6  71.4  99.1 +27.7     97.6  99.1  97.7 100.0  +2.3
SWP      94.5  96.8  98.4    -     -      89.1  98.4  97.5    -     -
TBA        -   93.8  92.6    *     *      96.1    -   98.7    *     *
TNT        -     -     -     *     *        -     -   44.4    *     *
VET      64.9    -     -   65.4    -        -   94.0    -   94.9    -
VSA        -   56.7    -     -     -        -   84.4    -     -     -
VSP        -     -     -   87.0    -        -     -     -   86.7    -
-----------------------------------------------------------------------
Mean:    87.4% 88.1% 89.0% 89.2% +4.0%    94.7% 95.9% 91.6% 95.3% +0.1%
-----------------------------------------------------------------------

------------------------ Overview Table A7: ---------------------------

Table A7 summarizes the results of tests under Windows NT. Tests were
performed for detection of file and macro viruses. In addition,
detection of file and macro related malware was also tested. Those
scanners which were submitted as being identical for Windows 98 and
Windows NT were only tested under Windows 98 (see Table A6). For
detailed results see "6fwinnt.txt".
Table A7: Detection Rate of File/Macro Viruses and Malware in Full and ITW Tests for Windows NT: ======================================================= Scanner File Viruses File Malware Macro Viruses Macro Malware ---------------------------------------------------------------------- Testbed 17148 100.0 2485 100.0 2874 100.0 142 100.0 ---------------------------------------------------------------------- AVA 16645 97.1 1647 66.3 2737 95.2 127 89.4 AVG 14972 87.3 1637 65.9 2371 82.5 98 69.0 AVK 15472 90.2 2356 94.8 2863 99.6 136 95.8 AVP 17137 99.9 2359 94.9 2850 99.2 130 91.5 AVX 12726 74.2 1460 58.8 2843 98.9 134 94.4 DRW 16005 93.3 1565 63.0 2825 98.3 116 81.7 DSS 17021 99.3 2426 97.6 2874 100.0 140 98.6 DWW 16837 98.2 1849 74.4 2823 98.2 115 81.0 FPR 16924 98.7 2216 89.2 2868 99.8 139 97.9 FSE 17146 100.0 2469 99.4 2874 100.0 140 98.6 FWN -- -- -- -- 2864 99.7 137 96.5 HMV -- -- -- -- 2860 99.5 137 96.5 INO 16817 98.1 2193 88.2 2867 99.8 136 95.8 IRS 16734 97.6 2178 87.6 2860 99.5 135 95.1 IVB -- -- -- -- 2729 95.0 118 83.1 NAV 16804 98.0 2235 89.9 2865 99.7 130 91.5 NOD 16738 97.6 1611 64.8 2869 99.8 137 96.5 NVC 16534 96.4 1724 69.4 2843 98.9 128 90.1 PAV 16663 97.2 2359 94.9 2857 99.4 134 94.4 PRO 6398 37.3 291 11.7 1668 58.0 38 26.8 RA7 15311 89.3 1402 56.4 2852 99.2 139 97.9 RAV 14660 85.5 1351 54.4 2850 99.2 138 97.2 SCN 16990 99.1 2402 96.7 2874 100.0 140 98.6 VET 11214 65.4 1023 41.2 2726 94.9 117 82.4 VSP 14913 87.0 1891 76.1 2493 86.7 142 100.0 ----------------------------------------------------------------------
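As a closing illustration, the following minimal sketch shows how
figures like those in Tables A1 and A4 could be derived from a parsed
scanner protocol. It is an assumption about data layout, not the
actual VTC evaluation suite: "protocol" maps each sample to the name
the scanner reported (or None for a miss), "truth" maps each sample
to the virus it actually carries, and a virus is counted as detected
if at least one of its samples is reported.

   from collections import defaultdict

   def evaluate(protocol, truth):
       # Derive overview figures from a parsed scanner protocol
       # (hypothetical layout, see above).
       names = defaultdict(set)    # virus -> names reported for it
       flags = defaultdict(list)   # virus -> per-sample hit flags
       for sample, virus in truth.items():
           reported = protocol.get(sample)
           flags[virus].append(reported is not None)
           if reported is not None:
               names[virus].add(reported)
       n = len(flags)
       detected = sum(1 for v in flags if any(flags[v]))
       # Unreliable detection (explanation 10 at Table A4): at least
       # one sample detected AND at least one sample missed.
       unreliable = sum(1 for v in flags
                        if any(flags[v]) and not all(flags[v]))
       # Inconsistent identification (explanation 9 at Table A4):
       # different names assigned to the same virus.
       inconsistent = sum(1 for v in names if len(names[v]) > 1)
       return {"virus detection (%)": 100.0 * detected / n,
               "unreliable detection (%)": 100.0 * unreliable / n,
               "inconsistent identification (%)":
                   100.0 * inconsistent / n}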