=====================================================
File 6ASUMOV.TXT:
Overview: Results of VTC Test 1998-10 (October 1998):
=====================================================
Formatted with non-proportional font (Courier)

1) General:
===========

The test presented here builds on the shoulders of previous VTC tests
performed by Vesselin Bontchev (his last test "1994-07" is available,
for comparison, from a parallel entry on our ftp site) and of the
regular VTC tests published since 1997-07; it is an upgrade of VTC's
last test "1998-02", published in March 1998. For details, see VTC's
homepage:
          http://agn-www.informatik.uni-hamburg.de/vtc

As in test "1998-02", VTC tested on-demand AntiVirus products under
DOS for their ability to detect boot, file and macro viruses in the
respective virus databases. For these tests, the virus databases were
significantly updated. Moreover, file and macro virus detection (both
in the full databases and in the "In-the-Wild" testbeds) was also
tested for scanner versions running under Windows 95, Windows 98 and
Windows NT.

In addition to the macro and file malware tests, this test also
determined the quality of detection of viruses in objects packed
with four popular packers, and it analysed the ability of AV products
to avoid "false positive" alarms.

The protocols produced by each scanner were analysed for indications
of how many viruses and infected objects were detected, and of
whether identification is consistent (that is: producing the same
name for all infected files, images and documents, respectively) and
reliable (detecting all infected files, images and documents).

The file, boot and macro viruses were frozen at their status as known
and available to VTC on April 30, 1998. For a detailed index of the
respective virus databases, see A3TSTBED.ZIP.

Following discussions about the virality esp. of samples in the boot
and file virus databases, a careful cleaning process was performed:
all samples of potentially questionable virality were moved to a
special database from which self-reproduction experiments were
started. We also received information from experts of tested products
which led us to move a few more samples to this "doubtful" database
for further inspection. No questions or doubts were raised about the
macro virus testbed. Consequently, the testbeds used for the final
evaluation contain fewer file viruses and infected objects than in
the last test. It is VTC's policy to move samples of questionable
virality or malicity back to the related testbeds once we can prove
definitively that they are viral or otherwise malicious.

After a series of pre-tests during which the test procedures were
installed, final tests were performed on all available products
received before mid-July 1998. Results were only included for the
most recent version of each scanner.

2) Special experiences in test "1998-10":
=========================================

Original plans were to publish the test report end-July 1998. This
goal could not be achieved, and the test was delayed, due to several
unforeseen problems:

Problem #1: During preparation of and test runs on the very large
            file virus database, it was observed that system support
            for managing files and directories produced unforeseen
            results. When moving larger portions of files and
            directories (esp. with Explorer under Windows-NT), files
            were partly not moved and partly even lost. During tests,
            it was observed (esp. when scanners ran in Win-NT's DOS
            box) that not all directories had been scanned.
            During tests, it became evident that NTFS (and also FAT)
            did not work as reliably as assumed. Similar observations
            have also been reported by other experts. This problem
            could only be overcome with extreme care and with the
            added work of comparing which files/directories had been
            touched and which had been suitably processed by each
            scanner. This unforeseen workload delayed test reporting
            by more than ***two months***. After detailed quality
            analysis, we are now confident that the test results are
            of "good quality".

Problem #2: Following Murphy's law, we were also victims of power
            failures originating in our neighbour laboratory (used by
            students of Theoretical Informatics), whose coffee
            machine produced sporadic power outages; the cause was
            detected only after two weeks of intermittent outages.
            Due to these failures, we lost more than two weeks of
            test work (thanks to our faculty, we thereafter were the
            first to get a power failsafe device :-)

Problem #3: A rather "normal" problem is that a growing number of AV
            products do not behave "well" (conforming with VTC test
            requirements) on our growing databases. Some products
            even stopped scanning for unforeseen reasons. Concerning
            growing test time, our world record holder is now a
            product which - although otherwise well-behaving - needed
            11 days including 10 nights to perform the file virus
            test!

Problem #4: Our time schedule had to be in line with the time
            schedules of our student testers, concerning work and
            vacationing during the semester holidays (mid-July until
            mid-October). Non-availability of testers and evaluation
            specialists contributed further to the delay.

Consequently, the data were ready for quality assessment and report
generation only end-October, and the report was - as well and as fast
as possible during times of lectures, seminars and examinations -
produced for publication end-November 1998.

Nevertheless, these experiences have taught us even more about
"unreliable" systems :-) Consequently, we have changed our schedules
and procedures to permit more flexibility in cases of unforeseen
(although not TOO long) problems. The related procedures are already
applied in the "next" test (VTC "1999-02"), which was started on
schedule by freezing the testbeds at their status as of October 1998.

3) Results and evaluation:
==========================

For details of the results of the scanner tests under DOS, Windows
95, Windows 98 and Windows NT, see test documents 6B-6F. For a
detailed analysis of performances, see 7EVAL.TXT. All documents are
in ASCII, formatted (and best reproducible) with non-proportional
fonts (Courier).

4) Overview Tables:
===================

In order to extract optimum information from the test results,
comparative tables were produced for the essential sets, both for the
"full (zoo) test" and for the subset regarded as equivalent to Joe
Wells' and the WildList Organisation's "In-The-Wild" list.

The result tables are given in ASCII, in a form from which an EXCEL
or LOTUS spreadsheet can easily be derived: simply delete the
headline and import the "pure" table into the related spreadsheet.
VTC deliberately did NOT follow suggestions to (optionally) present
the results in XLS form, to avoid ANY possible macro-viral side
effect (e.g. when some mirror site inadvertently implants an XLS
virus during its preprocessing, e.g. when adding some information on
the mirror site).
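As an illustration of this import route, the following small Python
sketch (not part of the VTC distribution; file names are arbitrary)
converts the body of one of the whitespace-separated ASCII tables
below into CSV, which EXCEL or LOTUS can import directly. It assumes
the headline has already been deleted, as suggested above.

  # ascii2csv.py - illustrative only: turn an ASCII result table into CSV.
  import csv
  import sys

  def ascii_table_to_csv(lines, out_file):
      writer = csv.writer(out_file)
      for line in lines:
          line = line.rstrip()
          # Skip empty lines and separator lines consisting of "-" or "=".
          if not line or set(line) <= {"-", "="}:
              continue
          # Columns are separated by runs of blanks; untested entries
          # ("-", "---") are kept verbatim.
          writer.writerow(line.split())

  if __name__ == "__main__":
      with open(sys.argv[1]) as table:
          ascii_table_to_csv(table.readlines(), sys.stdout)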
In order to determine whether and to what degree AntiVirus products
also help users to identify non-replicating malicious software such
as trojan horses, virus generators and intended (though not
replicating) viruses, a special test was performed to detect known
non-viral malware related to macro and file viruses. Macro malware
was selected as published in VTC's monthly "List of Known Macro
Viruses". For this test, known network malware (e.g. worms, hostile
applets and malicious ActiveX controls) was deliberately excluded
(but will be incorporated into a future malware test).

As the awareness of several AV experts concerning these threats to
their customers is not yet sufficiently well developed, we accepted
(so far, and ending with this test) not to publish malware detection
results for those products whose producers requested abstention
(related entries are marked with "*" in the respective tables).

Detailed results are collected in separate files:

Operating system = DOS:
  6bdosfil.txt: Detection Rates of File Viruses (full/ITW) and File
                Malware, as well as packed objects
  6cdosboo.txt: Detection Rates of Boot Viruses (full/ITW)
  6ddosmac.txt: Detection Rates of Macro Viruses (full/ITW) and
                Macro Malware, as well as packed objects

Operating system = Windows 95:
  6ewin95.txt:  Detection Rates of File/Macro Viruses (full/ITW)

Operating system = Windows NT:
  6fwinnt.txt:  Detection Rates of File/Macro Viruses (full/ITW)

This text contains the following tables to give an overview of
important VTC test "1998-10" results:

Table A1: Detection Rate of File/Boot/Macro Viruses in Full Test (DOS)
Table AA: Comparison: File/Macro Virus Detection Rate in last 4 VTC
          tests (DOS)
Table A2: Detection Rate of Boot/File/Macro Viruses in In-The-Wild
          Test (DOS)
Table A3: Detection Rate of Infected DOS Objects
          (Files/Images/Documents)
Table A4: Consistency/Reliability of Virus Detection in Full Test (DOS)
Table A5: Detection Rate of Non-Viral File+Macro Malware
Table A6: Detection Rate of File/Macro Viruses in Full/ITW Test
          (Windows 95)
Table A7: Detection Rate of File/Macro Viruses in Full/ITW Test
          (Windows NT)

Much more information is available from the detailed tables,
including detection of viruses on goat files, images and documents
(see the related chapters, as referred to in "1CONTENT.TXT").

------------------------ Overview Table A1: -----------------------

This table contains detection rates for boot, file and macro viruses
for all products tested, including those where several subsequent
versions were received during the test period.
Table A1: Detection Rate of File/Boot/Macro Viruses in Full DOS Test:
=====================================================================

Scanner    Boot Viruses       File Viruses       Macro Viruses
------------------------------------------------------------------
Total:      881  100.0       14596  100.0        2159  100.0
------------------------------------------------------------------
ANT         558   63.3       10589   75.7        1221   56.6
AVG         714   81.0       12258   87.6        1762   81.6
AVK         877   99.5       12590   90.0        2153   99.7
AVP         880   99.9       13944   99.7        2159  100.0
AVS         865   98.2       13696   97.9        2099   97.2
DRW         806   91.5       13032   93.1        2143   99.3
DSS         880   99.9       13985   99.9        2159  100.0
FPR         790   89.7       13361   95.5        2154   99.8
FSE         880   99.9       13945   99.7        1946   90.1
FWN           -      -           -      -        1851   85.7
HMV           -      -           -      -        2137   99.0
INO         817   92.7       13081   93.5        2056   95.2
ITM         666   75.6        9213   65.8        1647   76.3
NAV           -      -       13723   98.1        2155   99.8
NVC         826   93.8       13130   93.8        1961   90.8
RAV         511   58.0        9931   71.0        2149   99.5
SCN         830   94.2       12288   87.8        2129   98.6
SWP         859   97.5       13770   98.4        2129   98.6
TBA         764   86.7       13039   93.2        2132   98.7
TSC           -      -        7854   56.1         367   17.0
VBS           -      -        4967   35.5           -      -
VSP         442   50.2       10646   76.1           -      -
VET           -      -           -      -        2105   97.5
------------------------------------------------------------------

Explanation of the different columns:

1) "Scanner Codename" is the code name of the scanner as listed in
   file A2SCANLS.TXT.

2) "Number of File Viruses (%)" is the number of different file-
   infecting *viruses* in the virus collection used during the tests
   which were detected by the particular scanner. Their percentage of
   the full set of viruses in the collection used for the tests is
   given in parentheses. We define two viruses as being different if
   they differ in at least one bit in their non-modifiable parts. For
   variably encrypted viruses, the virus body has to be decrypted
   before the comparison is performed. For polymorphic viruses, the
   part of the virus which is modified during the replication process
   additionally has to be ignored. (A small sketch following these
   explanations illustrates this counting rule.)

3) "Number of Boot Viruses (%)" as in (2), but for boot/DBR
   infectors.

4) "Number of Macro Viruses (%)" is the number of different macro
   *viruses* from the collection used for the test that the scanner
   detects. This field is analogous to field 2, only it lists macro
   viruses, not file-infecting viruses.
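To make the counting rule in (2) concrete: the sketch below (data
layout and names are hypothetical, not VTC's actual tooling) derives
a scanner's "Number of Viruses (%)" entry from per-virus detection
results, where a virus counts as detected if at least one of its
samples is reported.

  # Hypothetical layout: distinct virus (per the bit-wise definition
  # above) -> set of scanner codenames that detected it.
  detected_by = {
      "WM/Concept.A": {"AVP", "DSS"},
      "WM/Wazzu.A":   {"AVP"},
      "XM/Laroux.A":  {"DSS"},
  }

  def viruses_detected(detected_by, scanner):
      """Number and percentage of distinct viruses found by one scanner."""
      found = sum(1 for s in detected_by.values() if scanner in s)
      return found, 100.0 * found / len(detected_by)

  print(viruses_detected(detected_by, "AVP"))   # -> (2, 66.66...)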
------------------------ Overview Table AA: ---------------------

This table indicates, for the scanners (most recent version each
time) in the last four tests (1997-02, 1997-07, 1998-02, 1998-10),
how the file and macro virus detection rates developed. The results
of these tests are given, and the difference DELTA between the last
2 tests is calculated. A "+" sign indicates that the resp. scanner
improved in the related category, whereas a "-" sign indicates that
the present result is not as good as the previous one. Results of
+-0.5% are regarded as "statistical", as this may depend upon
differences in signature updates. In some cases, comparison is
impossible due to problems in the previous or present test.

Table AA: Comparison: File/Macro Virus Detection Rate
          in last 4 VTC tests (DOS):
=====================================================

      ---- File Virus Detection ----  --- Macro Virus Detection ---
SCAN- 97/02 97/07 98/02 98/10 DELTA   97/02 97/07 98/02 98/10 DELTA
NER     %     %     %     %     %       %     %     %     %     %
------------------------------------------------------------------
ALE   98.8  94.1  89.4    -     -     96.5  66.0  49.8    -     -
AVS   98.9  97.4  97.4  97.9  +0.5    99.3  98.2  80.4  97.2 +16.8
AVG   79.2  85.3  84.9  87.6  +2.7    25.2  71.0  27.1  81.6 +54.6
AVK     -     -     -   90.0    -       -     -     -   99.7    -
AVP   98.5  98.4  99.3  99.7  +0.4    99.3  99.0  99.9 100.0  +0.1
ANT   73.4  80.6  84.6  75.7  -8.3    58.0  68.6  80.4  56.6 -23.8
DRW   93.2  93.8  92.8  93.1  +0.3    90.2  98.1  94.3  99.3  +5.0
DSS   99.7  99.6  99.9  99.9   0.0    97.9  98.9 100.0 100.0  +0.0
FMA     -     -     -     -     -     98.6  98.2  99.9    -     -
FPR   90.7  89.0  96.0  95.5  -0.5    43.4  36.1  99.9  99.8  -0.1
FSE     -     -   99.4  99.7    -       -     -   99.9  90.1  -9.8
FWN     -     -     -     -     -     97.2  96.4  91.0  85.7  -5.3
IBM   93.6  95.2  96.5    -     -     65.0  88.8  99.6    -     -
INO     -     -   92.0  93.5  +1.5      -     -   90.3  95.2  +4.9
IRS     -   81.4  74.2    -     -       -   69.5  48.2    -  -22.3
ITM     -   81.0  81.2  65.8 -15.4    81.8  58.2  68.6  76.3  +7.7
IVB    8.3    -     -     -     -       -     -     -     -     -
HMV     -     -     -     -     -       -     -   98.2  99.0  +0.8
NAV   66.9  67.1  97.1  98.1  +1.0    80.7  86.4  98.7  99.8  +1.1
NVC   87.4  89.7  94.1  93.8  -0.3    13.3  96.6  99.2  90.8  -8.4
PAN     -     -   67.8    -     -       -     -   73.0    -     -
PAV     -   96.6  98.8    -     -       -   93.7 100.0    -     -
PCC     -     -     -     -     -       -   67.6    -     -     -
PCV   67.9    -     -     -     -       -     -     -     -     -
RAV     -     -     -   71.0    -       -     -     -   99.5    -
SCN   83.9  93.5  90.7  87.8  -0.9    95.1  97.6  99.0  98.6  -1.4
SWP   95.9  94.5  96.8  98.4  +1.6    87.4  89.1  98.4  98.6  -0.2
TBA   95.5  93.7  92.1  93.2  +1.1    72.0  96.1  99.5  98.7  -1.2
TSC     -     -   50.4  56.1  +5.7      -     -   81.9  17.0 -64.9
TNT   58.0    -     -     -     -       -     -     -     -     -
VDS     -   44.0  37.1    -     -     16.1   9.9   8.7    -     -
VET     -   64.9    -     -     -       -   94.0  97.3  97.5  +0.2
VRX     -     -     -     -     -       -     -     -     -     -
VBS   43.1  56.6    -   35.5    -       -     -     -     -     -
VHU   19.3    -     -     -     -       -     -     -     -     -
VSA     -     -   56.9    -     -       -     -   80.6    -     -
VSP     -     -     -   76.1    -       -     -     -     -     -
VSW     -     -   56.9    -     -       -     -   83.0    -     -
VTR   45.5    -     -     -     -      6.3    -     -     -     -
XSC   59.5    -     -     -     -       -     -     -     -     -
------------------------------------------------------------------
Mean  74.2  84.8  84.4  85.4          69.6  80.9  83.8  89.6
------------------------------------------------------------------

Explanation of new column:

5) "DELTA" is the relative difference between the results of tests
   "1998-02" and "1998-10" (the sketch below reproduces this
   calculation).
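The DELTA column can be reproduced mechanically. The following
illustrative sketch applies the +-0.5% "statistical" margin defined
above and flags cases where no comparison is possible:

  def delta(prev, curr, margin=0.5):
      """DELTA between two detection rates in percent; None = not tested."""
      if prev is None or curr is None:
          return None, "no comparison possible"
      d = round(curr - prev, 1)
      if abs(d) <= margin:
          return d, "statistical (within +-0.5%)"
      return d, "improved" if d > 0 else "worse than previous test"

  print(delta(84.9, 87.6))   # AVG, file zoo: (2.7, 'improved')
  print(delta(96.0, 95.5))   # FPR, file zoo: (-0.5, 'statistical ...')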
------------------------ Overview Table A2: ---------------------

This table indicates how many viruses belonging to the "In-The-Wild"
subset of the full virus databases were found by the respective
scanner. The optimum measure is 100%. For detailed results, see
"6bdosfil.txt", "6cdosboo.txt" and "6ddosmac.txt".

Table A2: Detection Rate of Boot/File/Macro Viruses in DOS-ITW Test:
====================================================================

Scanner    Number  Boot      Number  File      Number  Macro
Codename:  Viruses ITW(%)    Viruses ITW(%)    Viruses ITW(%)
---------------------------------------------------------------------
Total:       207   100.0       122   100.0        75   100.0
---------------------------------------------------------------------
ANT          173    83.6       114    93.4        54    72.0
AVG          204    98.6       121    99.2        75   100.0
AVK          205    99.0       122   100.0        75   100.0
AVP          206    99.5       122   100.0        75   100.0
AVS          206    99.5       121    99.2        75   100.0
DRW          204    98.6       121    99.2        75   100.0
DSS          207   100.0       122   100.0        75   100.0
FPR          201    97.1       121    99.2        75   100.0
FSE          206    99.5       122   100.0        75   100.0
FWN            -       -         -       -        65    86.7
HMV            -       -         -       -        75   100.0
INO          205    99.0       121    99.2        75   100.0
ITM          190    91.8       110    90.2        73    97.3
NAV            -       -       119    97.5        75   100.0
NVC          203    98.1       122   100.0        75   100.0
RAV          177    85.5       119    97.5        75   100.0
SCN          206    99.5       120    98.4        75   100.0
SWP          205    99.0       122   100.0        75   100.0
TBA          201    97.1       121    99.2        75   100.0
TSC            -       -        99    81.1        68    90.7
VBS            -       -       103    84.4         -       -
VSP          129    62.3        98    80.3         -       -
VET            -       -         -       -        75   100.0
------------------------------------------------------------------

------------------------ Overview Table A3: ---------------------

This table indicates how many infected objects (files, boot/MBR
images, Word and EXCEL documents) were found by the respective
scanner in the full database. The optimum measure is 100%. For
detailed results, see "6bdosfil.txt", "6cdosboo.txt" and
"6ddosmac.txt".

Table A3: Detection Rate of Infected DOS Objects
          (Files/Images/Documents):
================================================

           -------- Number of objects infected: -------------
Scanner    with Boot         with File          with Macro
Codename   Viruses  (%)      Viruses  (%)       Viruses  (%)
-------------------------------------------------------------------
Total:      4806  100.0      112036  100.0       9033  100.0
-------------------------------------------------------------------
ANT         2908   60.5           -      -       4969   55.0
AVG         3607   75.1      100833   90.0       7423   82.2
AVK         4792   99.7      101025   90.2       9019   99.8
AVP         4806  100.0      111702   99.7       9029  100.0
AVS         4664   97.0      109836   98.0       8870   98.2
DRW         4385   91.2      105701   94.3       8991   99.5
DSS         4805  100.0      111860   99.8       9033  100.0
FPR         4385   91.2      109663   97.9       9021   99.9
FSE         4806  100.0      111701   99.7       8272   91.6
FWN            -      -           -      -       8044   89.1
HMV            -      -           -      -       8972   99.3
INO         4304   89.6      106375   94.9       8584   95.0
ITM         2783   57.9       69667   62.2       6655   73.7
NAV            -      -      110232   98.4       9019   99.8
NVC         4332   90.1      106710   95.2       8378   92.7
RAV         2873   59.8       78022   69.6       8980   99.4
SCN         4322   89.9      101246   90.4       8954   99.1
SWP         4591   95.5      110576   98.7       8971   99.3
TBA         3590   74.7      105661   94.3       8968   99.3
TSC            -      -       58446   52.2       1430   15.8
VBS            -      -       41819   37.3          -      -
VSP         1898   39.5       77095   68.8       8902   98.5
-------------------------------------------------------------------

Explanation of the different columns (see also 1-5 at Tables A1/AA):

6) "Number (%) of objects infected with file viruses" is the number
   of *files* infected with file-infecting viruses from the test set
   which are detected by that particular scanner as being infected.
   The percentage of those files out of the full set of files is
   given in parentheses. We often have more than one infected file
   per virus, but not all viruses are represented by the same number
   of files, so this number does not give a good impression of the
   real detection rate of the scanner (see the small sketch after
   these explanations). It is included here only for completeness.
   Of course, it still *does* provide some information - usually the
   better a scanner is, the more files it will detect as infected.

7) "Number (%) of objects infected with boot viruses" is the number
   of infected boot sectors in the test set that the scanner detects
   as infected. This field is analogous to field 6, though it lists
   infected boot sectors, not files.

8) "Number (%) of objects infected with macro viruses" is the number
   of infected documents in the test set that the scanner detects as
   infected. This field is analogous to field 6, though it lists
   infected documents, not files.
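The caveat in (6) - per-object counts depend on how many samples each
virus contributes to the testbed - can be illustrated with a small
sketch (all numbers hypothetical): a scanner that fully detects one
over-represented virus but misses two rare ones still scores a high
object rate.

  # Hypothetical testbed: virus -> (objects in testbed, objects flagged).
  samples = {
      "Virus.A": (100, 100),   # over-represented, fully detected
      "Virus.B": (2, 0),       # rare, missed completely
      "Virus.C": (2, 0),       # rare, missed completely
  }

  objects_total = sum(total for total, _ in samples.values())
  objects_found = sum(found for _, found in samples.values())
  viruses_found = sum(1 for _, found in samples.values() if found > 0)

  print("object rate: %.1f%%" % (100.0 * objects_found / objects_total))
  print("virus rate:  %.1f%%" % (100.0 * viruses_found / len(samples)))
  # -> object rate: 96.2%   virus rate: 33.3%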
------------------------ Overview Table A4: ---------------------

This table provides information about the "quality" of detection.
Inconsistent identification means that a virus is identified with
different names in different objects infected by the same virus.
Unreliable detection means that a virus is identified at least once,
though not in all objects infected with it. The optimum measures
both for inconsistency and unreliability are 0%. For detailed
results, see "6bdosfil.txt", "6cdosboo.txt" and "6ddosmac.txt".

Table A4: Consistency/Reliability of Virus Detection in Full DOS Test:
======================================================================

Scanner    Inconsistent Identification:    Unreliable Detection:
Codename:  Boot(%)  File(%)  Macro(%)      Boot(%)  File(%)  Macro(%)
------------------------------------------------------------------
ANT          1.5      4.0      1.5           8.4      2.0      0.2
AVG          3.5      2.5      1.7          17.1      2.0      1.0
AVK          9.0      3.1      2.2           0.7      0.2      0.0
AVP          9.0      3.3      2.1           0.1      0.1      0.2
AVS          5.4      4.5      0.7           0.9      0.7      0.2
DRW          3.0      3.0      2.3           4.1      1.7      0.0
DSS          0.2      2.7      0.5           0.0      0.0      0.0
FPR          9.5      1.5      0.2           4.5      0.8      0.1
FSE          9.0      3.3      1.9           0.1      0.1      0.1
FWN            -        -      0.8             -        -      0.1
HMV            -        -      0.4             -        -      0.2
INO         10.6      3.3      1.4           4.9      1.1      0.2
ITM          0.0      2.5      9.6          40.1      3.8      2.5
NAV            -     12.0      1.5             -      1.1      0.0
NVC          3.1      3.4      1.0           2.2      2.4      0.4
RAV          3.2      6.9      0.4           7.3      3.4      0.7
SCN          7.0      6.1      1.3           3.6      2.7      0.0
SWP          7.3      5.1      1.2           1.9      0.9      0.1
TBA          1.7      2.5      1.0          28.7      3.3      0.3
TSC            -      2.3      0.8             -      3.4      0.6
VBS            -      0.9        -             -      4.2        -
VSP          5.8     31.5        -          15.4      5.4        -
VET            -        -      2.5             -        -      0.0
------------------------------------------------------------------

More explanation of the different columns (see 1-8 at Tables A1-A3):

9)  The fields "Inconsistent Identification" measure the relative
    amount (%) of viruses to which different names were assigned in
    different objects. This is, to some extent, a measure of how
    precise the identification capability of the resp. scanner is;
    the optimum measure is 0%.

10) The fields "Unreliable Detection" measure the relative amount
    (%) of viruses which were only partly detected. The definition
    of unreliable detection is that at least one sample of the virus
    *is* detected and at least one sample of the virus is *not*
    detected. In some sense, unreliable detection is more dangerous
    than a scanner missing the virus completely, because an
    unreliably detected virus may be a hidden source of continuous
    viral infections. (The sketch below makes both definitions
    precise.)

Remark: in comparison with previous VTC tests, we have refrained
from reporting further details such as "unreliable identification"
and "multiple reports", to reduce information overload.
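Fields (9) and (10) can be stated precisely in a short sketch
(hypothetical data layout, illustrative only): per virus, collect
the name the scanner reported for each sample, or None for a miss.

  def classify(reports):
      """reports: one entry per sample; a name, or None if missed."""
      names  = {n for n in reports if n is not None}
      missed = any(n is None for n in reports)
      inconsistent = len(names) > 1             # same virus, several names
      unreliable   = missed and len(names) > 0  # found some, missed some
      return inconsistent, unreliable

  print(classify(["Tremor.A", "Tremor.B", "Tremor.A"]))  # (True, False)
  print(classify(["Tremor.A", None]))                    # (False, True)
  print(classify([None, None]))   # (False, False): missed entirely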
------------------------ Overview Table A5: ---------------------

This table indicates whether a given AntiVirus DOS product also
detects non-viral malware, esp. including virus generators, trojans
and intended (though not self-replicating) viruses. For macro
malware, VTC's "List of Known Macro Malware" displays the current
status of all known malicious threats. For detailed results see
"6bdosfil.txt" and "6ddosmac.txt".

Table A5: Detection Rate of Non-Viral File+Macro Malware
========================================================

Scanner    Number FileMalware    Number MacroMalware
Codename:  Found     (%)         Found     (%)
----------------------------------------------------
Testbed:    3321    100.0         111    100.0
----------------------------------------------------
ANT         1751     52.7          26     23.4
AVG         2366     71.2          77     69.4
AVK         3109     93.6         107     96.4
AVP         3137     94.5         107     96.4
AVS         2665     80.2         107     96.4
DRW         2639     79.5         101     91.0
DSS         3269     98.4         111    100.0
FPR         ****     ****         ***     ****
FSE         3138     94.5         107     96.4
FWN            -        -          81     73.0
HMV            -        -          99     89.2
INO         2880     86.7         104     93.7
ITM         1552     46.7          35     31.5
NAV         ****     ****         ***     ****
NVC         2414     72.7          98     88.3
RAV         2116     63.7         107     96.4
SCN         ****     ****         ***     ****
SWP         ****     ****         ***     ****
TBA         2052     61.8         102     91.9
TSC          679     20.4          96     86.5
VBS          792     23.8           -        -
VSP         2220     66.8           -        -
VET            -        -          92     82.9
----------------------------------------------------

Remark: Scanners which were not tested for malware detection
following a request of their producers are marked "*".

------------------------ Overview Table A6: ---------------------

Under Windows 95, several 32-bit scanners were tested. Tests were
performed for detection of file and macro viruses. In addition,
detection of file and macro malware was also tested. For detailed
results see "6ewin95.txt".

Table A6: Detection Rate of File/Macro Viruses and Malware
          in Full and ITW Tests for Windows 95:
=======================================================

Scanner    Number File    Number File    Number Macro   Number Macro
Codename:  Viruses  (%)   Malware  (%)   Viruses   (%)  Malware  (%)
--------------------------------------------------------------------
Total:     13993  100.0    3321  100.0    2159  100.0    191  100.0
--------------------------------------------------------------------
ANT        12778   91.3    2869   86.4    1850   85.7    171   89.5
ANY          ---      -    1851   55.7    1527   70.7    105   55.0
AVK        13940   99.6    3093   93.1    2151   99.6    186   97.4
AVP        13981   99.9    3137   94.5    2159  100.0    186   97.4
AVS        13516   96.6    2622   79.0    2088   96.7    180   94.2
DSS        13986   99.9    3255   98.0    2159  100.0    191  100.0
FPR/FMA     ----      -    ****      *    1995   92.4    ***      *
FSE         ----      -     ---      -    2159  100.0    191  100.0
FWN         ----      -     ---      -     ---      -    185   96.9
IBM        12984   92.8    ****      *    2041   94.5    ***      *
INO        12925   92.4    2579   77.7    1901   88.1    180   94.2
IRS        13528   96.7    2968   89.4    2137   99.0    184   96.3
IVB         ----      -     ---      -    2004   92.8    164   85.9
NAV         ----      -     ---      -    2058   95.3    ***      *
NVC        13103   93.6    2406   72.4     ---      -    171   89.5
PAV        13766   98.4    3042   91.6    2148   99.5    186   97.4
RAV        11883   84.9     941   28.3     301   13.9     56   29.3
SCN        12286   87.8    ****      *    2129   98.6    ***      *
SWP        13766   98.4    ****      *    2129   98.6    ***      *
TBA        12957   92.6    2443   73.6    2132   98.7    177   92.7
VBS         ----      -     576   17.3    1914   88.7    143   74.9
-------------------------------------------------------------------

Remark: Scanners which were not tested for malware detection
following a request of their producers are marked "*".

------------------------ Overview Table A7: ---------------------

Under Windows NT, a shorter list of scanners was tested. Tests were
performed for detection of file and macro viruses. In addition,
detection of file and macro malware was also tested. Those scanners
which were submitted as being identical for Windows 95 and Windows
NT were only tested under Windows 95 (see Table A6). For detailed
results see "6fwinnt.txt".
Table A7: Detection Rate of File/Macro Viruses and Malware
          in Full and ITW Tests for Windows NT:
=======================================================

Scanner    Number File    Number File    Number Macro   Number Macro
Codename:  Viruses  (%)   Malware  (%)   Viruses   (%)  Malware  (%)
-------------------------------------------------------------------
Total:     13993  100.0    3321  100.0    2159  100.0    111  100.0
-------------------------------------------------------------------
ACU         ----      -     ---      -     ---      -    108   97.3
ANT        12778   91.3    2869   86.4    1850   85.7     97   87.4
ANY         9753   69.7     359   10.8    1522   70.5     51   45.9
AVK        13932   99.6    3093   93.1    2151   99.6    107   96.4
AVP        11714   83.7    3137   94.5    2159  100.0    107   96.4
AVS        13516   96.6    2672   80.5    2099   97.2    104   93.7
DSS        13985   99.9    3267   98.4    2159  100.0    111  100.0
FPR/FMA     ----      -    ****   ****    2154   99.8    ***   ****
FSE        13964   99.8    3105   93.5     ---      -    111  100.0
FWN         ----      -     ---      -    2151   99.6    107   96.4
HMV         ----      -     ---      -    2137   99.0     99   89.2
IBM        10800   77.2    ****   ****    2129   98.6    ***   ****
IVB         ----      -     ---      -    2004   92.8     95   85.6
NAV         ----      -    ****   ****    2157   99.9    ***   ****
NVC        13101   93.6     ---      -     ---      -     96   86.5
PAV        13766   98.4    3030   91.2    2148   99.5    107   96.4
RAV        11883   84.9    2116   63.7    2149   99.5    107   96.4
SCN         9992   71.4    ****   ****    2109   97.7    ***   ****
SWP        13771   98.4    ****   ****    2105   97.5    ***   ****
TBA        12957   92.6    2443   73.6    2132   98.7    102   91.9
TNT         ----      -     ---      -     958   44.4     73   65.8
-------------------------------------------------------------------

Remark: Scanners which were not tested for malware detection
following a request of their producers are marked "*".