=======================================================
File 6ASUMOV.TXT:
Overview: Results of VTC Test 1999-09 (September 1999)
=======================================================
Formatted with a non-proportional font (Courier).

Content of this file:
=====================
1) General/Background
2) Special problems/experiences in test "1999-09"
3) Results and evaluation
4) Overview Tables
   Table A0: Products/Versions in VTC test "1999-09"
   Table A1: Detection Rate of File/Boot/Macro Viruses in Full Test (DOS)
   Table AA: Comparison: File/Macro Virus Detection Rate in last 6 VTC tests (DOS)
   Table A2: Detection Rate of Boot/File/Macro Viruses in In-The-Wild Test (DOS)
   Table A3: Detection Rate of Infected DOS Objects (Files/Images/Documents)
   Table A4: Consistency/Reliability of Virus Detection in Full Test (DOS)
   Table A5: Detection Rate of Non-Viral File+Macro Malware
   Table AB: Comparison: File/Macro Virus Detection Rate in last 3 VTC tests under Windows 98
   Table A6: Detection Rate of File/Macro Viruses in Full/ITW Test (Windows 98)
   Table AC: Comparison: File/Macro Virus Detection Rate in last 5 VTC tests under Windows NT
   Table A7: Detection Rate of File/Macro Viruses in Full/ITW Test (Windows NT)

1) General/Background:
======================
The test presented here stands on the shoulders of previous VTC tests
performed by Vesselin Bontchev (his last test, "1994-07", is available
for comparison from another entry on VTC's ftp site) and of the regular
VTC tests published since 1997-07; it is an upgrade of VTC's last test,
"1999-03", published in March 1999. For details of previous tests, see
VTC's homepage:
        http://agn-www.informatik.uni-hamburg.de/vtc

Concerning operating platforms, tests were performed for AntiVirus
(AntiMalware) products running under DOS, Windows 98 and Windows NT,
respectively. As in previous tests, VTC tested on-demand AntiVirus
products under DOS for their ability to detect boot, file and macro
viruses in the respective virus databases.
File and macro virus detection (both in the full databases and in the
"In-the-Wild" testbeds) was tested for scanner versions running under
Windows 98 and Windows NT. For these tests, the virus databases were
significantly updated against the last tests. A separate test was
performed on multiple generations of 6 polymorphic file viruses; this
is a subset generated by VTC's dynamic generation procedure for the
detection of polymorphic viruses. Another separate test is based on
file viruses generated by the VKIT file virus generator. Further to
the macro and file malware tests, this test also determined the
quality of detection of viruses in objects packed with four popular
packers, and it analysed the ability of AV products to avoid "false
positive" alarms.

The protocols produced by each scanner were analysed for indications
of how many viruses and infected objects were detected, and whether
identification is consistent (that is: producing the same name for
all infected files, images and documents, respectively) and reliable
(detecting all infected files, images and documents).

The databases of file, boot and macro viruses were frozen in the
status known and available to VTC on March 31, 1999. For a detailed
index of the respective virus databases, see A3TSTBED.ZIP.

Following discussions about the virality of all samples in the boot
and file virus databases, a careful cleaning process took place in
which all samples of potentially questionable virality were moved to
a special database from which self-reproduction experiments were
started. We also received information from experts of tested products
which led us to move a few more samples to this "doubtful" database
for further inspection. No questions or doubts were raised about the
macro virus testbed. It is VTC's policy to move samples of
questionable virality or malicity back to the related testbeds once
we can prove definitively that they are viral or otherwise malicious.
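The consistency/reliability criteria above can be expressed in code. The
following is an illustrative Python sketch over a hypothetical protocol
format (the tuple layout and all names are assumptions for illustration,
not VTC's actual log format): identification is "consistent" if every
detected sample of a virus gets the same name, and detection is
"reliable" if every sample of the virus is detected.

```python
# Sketch, assuming a simplified log format: one (virus, sample, name) tuple
# per scanned object, with name=None when the scanner missed the sample.
from collections import defaultdict

def classify(report):
    """Return (inconsistently identified, unreliably detected) virus sets."""
    by_virus = defaultdict(list)
    for virus, sample, name in report:
        by_virus[virus].append(name)
    inconsistent, unreliable = set(), set()
    for virus, names in by_virus.items():
        hits = [n for n in names if n is not None]
        if len(set(hits)) > 1:               # detected under different names
            inconsistent.add(virus)
        if hits and len(hits) < len(names):  # detected, but not in all samples
            unreliable.add(virus)
    return inconsistent, unreliable

report = [
    ("Virus.A", "a1.com", "Virus.A"),
    ("Virus.A", "a2.com", "Virus.A.b"),  # same virus, different name
    ("Virus.B", "b1.com", "Virus.B"),
    ("Virus.B", "b2.com", None),         # sample missed
]
inc, unr = classify(report)
print(sorted(inc), sorted(unr))  # ['Virus.A'] ['Virus.B']
```

Dividing the sizes of these sets by the testbed size yields percentages
of the kind reported in Table A4 below.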
After a series of pre-tests during which test procedures were
installed (esp. for products with new engines and those participating
for the first time), final tests were performed on the available
products received before May 16, 1999. Results were included for the
most recent scanner and signature version of each product. The
following products participated in VTC test "1999-09":

Table A0: Products/Versions in VTC test "1999-09":
==================================================
 ANT/AN5 = H+B EDV AntiVir (CLI/GUI)                 for DOS, W-98, W-NT
 AVA     = AVAST v7.70 (28) May 99 (CD/diskettes)    for DOS, W3.x
           AVAST32 v3.0 (132) May 99 (CD/diskettes)  for W-98, W-NT
 AVK     = AVK 8.07 (CD)                             for DOS, W-98, W-NT
 AVP     = AVP 3033 (email)                          for DOS, W-98, W-NT
 AVG     = AVG 5.0 (CD/diskettes)                    for DOS, W-98, W-NT
 AVX     = AVX Version 4.1 (ftp)                     for DOS, W-98, W-NT
 CMD     = Command Software AV (CD)                  for DOS, W-98, W-NT
 DRW     = DrWeb 4.10 (email)                        for DOS, W-98, W-NT
 DWW     = DrWeb for Win32 4.10 (email)              for DOS, W-98, W-NT
 FPR     = FProt 3.05 (CD/diskettes)                 for DOS, W-98, W-NT
 FPW     = FProt FP-WIN 3.05 (CD)                    for DOS, W-98, W-NT
 FSE     = FSAV 4.03a (CD)                           for DOS, W-98, W-NT
 FWN     = FWin32 v. 1.82f (email)                   for W-NT
 INO     = Inoculan 4.50/4.21 5.14.99 (CD/diskettes) for DOS, W-98, W-NT
 MR2     = MR2S v.098 Gegamarx+SWE (email)           for DOS, W-98, W-NT
 NAV     = NAV 3.03/Sig May 1999 (CD/diskettes)      for DOS, W-98, W-NT
 NOD     = NOD32 v. 3.17 (email)                     for DOS, W-98, W-NT
 NVC     = NVC (GUI/CLI) 4.70 (CD)                   for DOS, W-98, W-NT
 PAV     = PAV 99.01/PAV 32 May 99 (CD)              for DOS, W-98, W-NT
 PRO     = Protector 6.6.A01 (ftp)                   for W-3, W-98, W-NT
 RAV     = RAV 7 v. 1.00b (CD)                       for DOS, W-98, W-NT
 SCN     = NAI VirusScan 4.0.2 (CD)                  for DOS, W-98, W-NT
 SWE     = Sweep v. 3.21 (http)                      for DOS, W-98, W-NT
 TSC     = TScan 1.81 (http)                         for DOS, W-98, W-NT
 VSP     = VSP 11.75.01 (email)                      for DOS, W-98, W-NT
-------------------------------------------------------------------

Those tables which summarize the development of detection rates over
several VTC tests comprise some products which were either no longer
available (*) or which were not submitted or available for this test
(-). For details of the related AV products (which are always
addressed with the same scanner code), see the related test reports
(available from VTC's www/ftp site).

2) Special problems/experiences in test "1999-09":
==================================================
Again, the complexity of the test and problems with scanners and
systems delayed publication by about 6 weeks. Indeed, several products
needed "special care" (aka "spoon-feeding") as they either crashed or
refused to scan several directories, for unknown reasons. In part,
such problems may be related to the MS-NTFS problem described in the
last test report ("1999-03", file "3asumov.txt": Problem #1). While
VTC testbeds are different from user data collections, which should
usually contain at most very few viruses, it is our sincere hope that
such products behave in a more "user-friendly" way in "normal
environments".

Concerning other impacts on timeliness, VTC's tests have to fit into
university schedules as well as the personal schedules of the students
who essentially run these tests. A special impediment during summer
1999 were Y2k-related student activities, which somewhat reduced the
priority of VTC test procedures. But apart from the problems
documented in "8problms.txt", the test procedures work with sufficient
stability.

3) Results and evaluation:
==========================
For details of the results of the scanner tests under DOS, Windows 98
and Windows NT, see test documents 6B-6F and 6H. For a detailed
analysis of performances, see 7EVAL.TXT.
All documents are in ASCII, formatted (and best reproducible) with
non-proportional fonts (Courier), 72 columns.

4) Overview Tables:
===================
In order to extract optimum information from the test results,
comparative tables were produced for the essential sets, both for the
"full (zoo) test" and for the subset regarded as equivalent to Joe
Wells' and the WildList Organization's "In-The-Wild" list. The result
tables are given in ASCII, in a form from which an EXCEL or LOTUS
spreadsheet can easily be derived, simply by deleting the headline and
importing the "pure" table into a related spreadsheet. VTC
deliberately did NOT follow suggestions to (optionally) present the
results in XLS form, to avoid ANY possible macro-viral side effect
(e.g. when some mirror site inadvertently implants an XLS virus during
its pre-processing, e.g. when adding some information on the mirror
site).

In order to determine whether and to what degree AntiVirus products
also help users to identify non-replicating malicious software such as
trojan horses, virus generators and intended (though not replicating)
viruses, a special test was performed to detect known non-viral
malware related to macro and file viruses. Macro malware was selected
as published in VTC's monthly "List of Known Macro Viruses". For this
test, known network malware (e.g. worms, hostile applets and malicious
ActiveX controls) was deliberately excluded, but it will be
incorporated into a future malware test. Different from previous tests
(where we agreed not to publish malware detection results for those
products whose producers requested abstention), the related malware
tests are from now on a mandatory part of VTC tests. We regret that we
cannot admit AV products whose developers are evidently not willing to
protect their customers from such threats, or who forbid that a
related capability of their product be tested by an independent
institution. Fortunately, all relevant AV producers agreed to the
related VTC test conditions.
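The described derivation of a spreadsheet from an ASCII result table can
be sketched as follows. This is a minimal Python illustration operating
on a made-up excerpt in the format of Table A1; the parsing code is an
assumption for illustration, not part of the VTC tooling:

```python
# Sketch: drop the headline (as the text describes), skip rule lines,
# split the remaining rows on whitespace, and emit CSV for import.
import csv
import io

ascii_table = """\
Scanner   Boot Viruses    File Viruses    Macro Viruses
-------------------------------------------------------
Testbed   1237  100.0%    17561 100.0%    3546 100.0%
AVA       1210   97.8     17109  97.4     3355  94.6
AVP       1236   99.9     17522  99.8     3546 100.0
"""

buf = io.StringIO()
writer = csv.writer(buf)
for line in ascii_table.splitlines()[1:]:   # [1:] deletes the headline
    if set(line.strip()) <= {"-"}:          # skip rule lines and blanks
        continue
    writer.writerow(line.split())
print(buf.getvalue())
```

Any spreadsheet program can then open the resulting CSV directly, which
is the same effect as the manual import the text describes.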
Detailed results are collected in separate files:

Operating system = DOS:
-----------------------
   6bdosfil.txt: Detection Rates of File Viruses (full/ITW) and File
                 Malware, as well as packed objects
   6cdosboo.txt: Detection Rates of Boot Viruses (full/ITW)
   6ddosmac.txt: Detection Rates of Macro Viruses (full/ITW) and Macro
                 Malware, as well as packed objects

Operating system = Windows 98:
------------------------------
   6ewin98.txt:  Detection Rates of File/Macro Viruses (full/ITW)

Operating system = Windows NT:
------------------------------
   6fwinnt.txt:  Detection Rates of File/Macro Viruses (full/ITW)

This text contains the following tables to give an overview of the
most important VTC test "1999-09" results:

 Table A1: Detection Rate of File/Boot/Macro Viruses in Full Test (DOS)
 Table AA: Comparison: File/Macro Virus Detection Rate in last 6 VTC tests (DOS)
 Table A2: Detection Rate of Boot/File/Macro Viruses in In-The-Wild Test (DOS)
 Table A3: Detection Rate of Infected DOS Objects (Files/Images/Documents)
 Table A4: Consistency/Reliability of Virus Detection in Full Test (DOS)
 Table A5: Detection Rate of Non-Viral File+Macro Malware
 Table AB: Comparison: File/Macro Virus Detection Rate in last 3 VTC tests under Windows 98
 Table A6: Detection Rate of File/Macro Viruses in Full/ITW Test (Windows 98)
 Table AC: Comparison: File/Macro Virus Detection Rate in last 5 VTC tests under Windows NT
 Table A7: Detection Rate of File/Macro Viruses in Full/ITW Test (Windows NT)

Much more information is available from the detailed tables, including
the detection of viruses in goat files, images and documents (see the
related chapters, as referred to in "1CONTENT.TXT").

------------------------ Overview Table A1: -----------------------

Table A1 contains detection rates for boot, file and macro viruses
under platform DOS for all products tested, including those where
several subsequent versions were received during the test period.
Table A1: Detection Rate of File/Boot/Macro Viruses in Full DOS Test:
=====================================================================
 Scanner   Boot Viruses     File Viruses     Macro Viruses
 ---------------------------------------------------------
 Testbed   1237  100.0%     17561  100.0%    3546  100.0%
 ---------------------------------------------------------
 AVA       1210   97.8      17109   97.4     3355   94.6
 AVG        914   73.9      15206   86.6     3425   96.6
 AVP       1236   99.9      17522   99.8     3546  100.0
 CMD       1145   92.6      17276   98.4     3530   99.5
 DRW       1203   97.3      17262   98.3        -      -
 FPR       1204   97.3      17426   99.2     3535   99.7
 FSE       1233   99.7      17437   99.3     3460   97.6
 INO       1164   94.1      16631   94.7     3529   99.5
 MR2        885   71.5      11489   65.4     2469   69.6
 NAV       1192   96.4      16856   96.0     3495   98.6
 NOD       1218   98.5      17024   96.9     3546  100.0
 NVC       1200   97.0          -      -     3531   99.6
 PAV       1219   98.5      17356   98.8     3502   98.8
 PRO          -      -          -      -      794   22.4
 SCN       1236   99.9      17044   97.1     3546  100.0
 SWP       1226   99.1      17386   99.0     3491   98.4
 TSC        680   55.0       9069   51.6     2469   69.6
 VSP        882   71.3      13976   79.6        5    0.1
 ---------------------------------------------------------
 Mean value:      90.6%             91.1%           90.3% (*)
 ---------------------------------------------------------

Explanation of the different columns:
1) "Scanner Codename" is the code name of the scanner as listed in
   file A2SCANLS.TXT (see also Table A0).
2) "Number of File Viruses (%)" is the number of different
   file-infecting *viruses* in the virus collection used during the
   tests which have been detected by the particular scanner. Their
   percentage of the full set of viruses in the collection used for
   the tests is given in parenthesis. We define two viruses as being
   different if they differ in at least one bit in their
   non-modifiable parts. For variably encrypted viruses, the virus
   body has to be decrypted before the comparison is performed. For
   polymorphic viruses, additionally the part of the virus which is
   modified during the replication process has to be ignored.
3) "Number of Boot Viruses (%)" as in (2), but for boot/DBR infectors.
4) "Number of Macro Viruses (%)" is the number of different macro
   *viruses* from the collection used for the test that the scanner
   detects. This field is analogous to field 2, only it lists macro
   viruses, not file-infecting viruses.

(*) Remark concerning the mean value of macro virus detection: for a
    fair basis of comparison, the unusually low detection value (0.1%)
    of VSP was not counted (mean value including VSP: 85.0%).

------------------------ Overview Table AA: ---------------------

Table AA indicates, for the scanners (most recent version each time)
in the last tests (1997-02 through 1999-09), how the file and macro
virus detection rates developed. The results of these tests are given,
and the difference DELTA between the last 2 tests is calculated. A "+"
sign indicates that the respective scanner improved in the related
category, whereas a "-" sign indicates that the present result is not
as good as the previous one. Results of +-0.5% are regarded as
"statistical", as this may depend upon differences in signature
updates. In some cases, a comparison is impossible due to problems in
the previous or present test.
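The DELTA column and its interpretation can be written down as a small
sketch (Python, illustrative only; the +-0.5% "statistical" band and the
improved/worse reading follow the rule stated above):

```python
# Sketch: compute DELTA between two test results and classify it.
def delta(prev, curr):
    """prev/curr: detection rates in percent, or None if not comparable."""
    if prev is None or curr is None:
        return None, "not comparable"
    d = round(curr - prev, 1)
    if abs(d) <= 0.5:
        return d, "statistical"
    return d, "improved" if d > 0 else "worse"

print(delta(99.7, 99.8))   # (0.1, 'statistical')
print(delta(77.2, 96.0))   # (18.8, 'improved')
print(delta(98.1, 94.7))   # (-3.4, 'worse')
```

The example values reproduce three DELTA entries of Table AA below (AVP,
NAV and INO file virus detection).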
Table AA: Comparison: File/Macro Virus Detection Rate in last 6 VTC
          tests under DOS:
===================================================================
SCAN  ------- File Virus Detection -------   ------ Macro Virus Detection ------
NER    9702  9707  9802  9810  9903  9909 DELTA   9702  9707  9802  9810  9903  9909 DELTA
--------------------------------------------------------------------------------
ALE    98.8  94.1  89.4     -     -     -     -   96.5  66.0  49.8     -     -     -     -
AVA    98.9  97.4  97.4  97.9  97.6  97.4  -0.2   99.3  98.2  80.4  97.2  95.9  94.6  -1.3
AVG    79.2  85.3  84.9  87.6  87.1  86.6  -0.5   25.2  71.0  27.1  81.6  82.5  96.6  14.1
AVK       -     -     -  90.0  75.0     -     -      -     -     -  99.7  99.6     -     -
AVP    98.5  98.4  99.3  99.7  99.7  99.8   0.1   99.3  99.0  99.9  100%  99.8  100%   0.2
ANT    73.4  80.6  84.6  75.7     -     -     -   58.0  68.6  80.4  56.6     -     -     -
DRW    93.2  93.8  92.8  93.1  98.2  98.3   0.1   90.2  98.1  94.3  99.3  98.3     -     -
DSS    99.7  99.6  99.9  99.9  99.8     *     *   97.9  98.9  100%  100%  100%     *     *
FMA       -     -     -     -     -     -     -   98.6  98.2  99.9     -     -     -     -
FPR    90.7  89.0  96.0  95.5  98.7  99.2   0.5   43.4  36.1  99.9  99.8  99.8  99.7  -0.1
FSE       -     -  99.4  99.7  97.6  99.3   1.7      -     -  99.9  90.1  99.6  97.6  -2.0
FWN       -     -     -     -     -     -     -   97.2  96.4  91.0  85.7     -     -     -
HMV       -     -     -     -     -     -     -      -     -  98.2  99.0  99.5     -     -
IBM    93.6  95.2  96.5     -     *     *     *   65.0  88.8  99.6     -     *     *     *
INO       -     -  92.0  93.5  98.1  94.7  -3.4      -     -  90.3  95.2  99.8  99.5  -0.3
IRS       -  81.4  74.2     -  51.6     -     -      -  69.5  48.2     -  89.1     -     -
ITM       -  81.0  81.2  65.8  64.2     -     -      -  81.8  58.2  68.6  76.3     -     -
IVB     8.3     -     -     -  96.9     -     -      -     -     -     -     -     -     -
MR2       -     -     -     -     -  65.4     -      -     -     -     -     -  69.6     -
NAV    66.9  67.1  97.1  98.1  77.2  96.0  18.8   80.7  86.4  98.7  99.8  99.7  98.6  -0.9
NOD       -     -     -  96.9     -  96.9     -      -     -     -     -  99.8  100%   0.2
NVC    87.4  89.7  94.1  93.8  97.6     -     -   13.3  96.6  99.2  90.8     -  99.6     -
PAN       -     -  67.8     -     -     -     -      -     -  73.0     -     -     -     -
PAV       -  96.6  98.8     -  73.7  98.8  25.1      -     -  93.7  100%  99.5  98.8  -0.7
PCC       -     -     -     -     -     -     -      -  67.6     -     -     -     -     -
PCV    67.9     -     -     -     -     -     -      -     -     -     -     -     -     -
PRO       -     -     -     -  35.5     -     -      -     -     -     -  81.5     -     -
RAV       -     -     -  71.0     -     -     -      -     -     -  99.5  99.2  -0.3     -
SCN    83.9  93.5  90.7  87.8  99.8  97.1  -2.7   95.1  97.6  99.0  98.6  100%  100%   0.0
SWP    95.9  94.5  96.8  98.4     -  99.0     -   87.4  89.1  98.4  98.6     -  98.4     -
TBA    95.5  93.7  92.1  93.2     *     *     *   72.0  96.1  99.5  98.7     *     *     *
TSC       -     -  50.4  56.1  39.5  51.6  12.1      -  81.9  17.0  76.5  59.5  69.6  10.1
TNT    58.0     -     -     -     *     *     *      -     -     -     -     *     *     *
VDS       -  44.0  37.1     -     -     -     -   16.1   9.9   8.7     -     -     -     -
VET       -  64.9     -     -  65.3     *     *      -  94.0  97.3  97.5  97.6     *     *
VRX       -     -     -     -     -     -     -      -     -     -     -     -     -     -
VBS    43.1  56.6     -  35.5     -     -     -      -     -     -     -     -     -     -
VHU    19.3     -     -     -     -     -     -      -     -     -     -     -     -     -
VSA       -     -  56.9     -     -     -     -      -     -  80.6     -     -     -     -
VSP       -     -     -  76.1  71.7  79.6   7.9      -     -     -     -     -   0.1     -
VSW       -     -  56.9     -     -     -     -      -     -  83.0     -     -     -     -
VTR    45.5     -     -     -     -     -     -    6.3     -     -     -     -     -     -
XSC    59.5     -     -     -     -     -     -      -     -     -     -     -     -     -
--------------------------------------------------------------------------------
Mean   74.2  84.8  84.4  85.4  81.2  90.6   5.0   69.6  80.9  83.8  89.6  93.6  88.2   1.7 (*)
--------------------------------------------------------------------------------

Explanation of the new column:
5) DELTA (=change) is the relative difference between the results of
   tests "1999-03" and "1999-09".

(*) Remark concerning the mean value of macro virus detection: for a
    fair basis of comparison, the unusually low detection value (0.1%)
    of VSP was not counted (mean value including VSP: 82.0%).

------------------------ Overview Table A2: ---------------------

Table A2 indicates how many viruses belonging to the "In-The-Wild"
subset of the full virus databases were found by the respective
scanner. The optimum measure is 100%. For detailed results, see
"6bdosfil.txt", "6cdosboo.txt" and "6ddosmac.txt".
Table A2: Detection Rate of Boot/File/Macro Viruses in DOS-ITW Test:
====================================================================
 Scanner   Boot Viruses     File Viruses     Macro Viruses
 ---------------------------------------------------------
 Testbed     42  100.0%       46  100.0%       59  100.0%
 ---------------------------------------------------------
 AVA         42  100.0        46  100.0        59  100.0
 AVG         39   92.9        46  100.0        59  100.0
 AVP         42  100.0        46  100.0        59  100.0
 CMD         42  100.0        46  100.0        59  100.0
 DRW         42  100.0        46  100.0         -      -
 FPR         42  100.0        46  100.0        59  100.0
 FSE         42  100.0        46  100.0        59  100.0
 INO         42  100.0        44   95.7        59  100.0
 MR2         39   92.9        40   87.0        39   66.1
 NAV         42  100.0        46  100.0        59  100.0
 NOD         42  100.0        46  100.0        59  100.0
 NVC         42  100.0         -      -        59  100.0
 PAV         42  100.0        46  100.0        59  100.0
 PRO          -      -         -      -        25   42.4
 SCN         42  100.0        46  100.0        59  100.0
 SWP         42  100.0        46  100.0        59  100.0
 TSC         39   92.9        40   87.0        39   66.1
 VSP         39   92.9        36   78.3         0    0.0
 ---------------------------------------------------------
 Mean value:      98.3%            96.8%            92.2% (*)
 ---------------------------------------------------------

(*) Remark concerning the mean value of macro virus detection: for a
    fair basis of comparison, the unusually low detection value (0.0%)
    of VSP was not counted (mean value including VSP: 86.7%).

------------------------ Overview Table A3: ---------------------

Table A3 indicates how many infected objects (files, boot/MBR images,
Word and Excel documents) were found by the respective scanner in the
full database. The optimum measure is 100%. For detailed results, see
"6bdosfil.txt", "6cdosboo.txt" and "6ddosmac.txt".
Table A3: Detection Rate of Infected DOS Objects (Files/Images/Documents):
==========================================================================
           -------- Number of objects infected: --------
 Scanner   Boot Viruses      File Viruses      Macro Viruses
 -----------------------------------------------------------
 Testbed   5286  100.0%     132576  100.0%     9731  100.0%
 -----------------------------------------------------------
 AVA       5127   97.0      129644   97.8      9267   95.2
 AVG       3627   68.6      117828   88.9      9392   96.5
 AVP       5280   99.9      132518  100.0      9728  100.0
 CMD       4984   94.3      131490   99.2      9680   99.5
 DRW       5157   97.6      130778   98.6         -      -
 FPR       5213   98.6      132180   99.7      9696   99.6
 FSE       5272   99.7      132028   99.6      9443   97.0
 INO       4947   93.6      124956   94.3      9689   99.6
 MR2       3484   65.9       80811   61.0      6823   70.1
 NAV       4741   89.7      127710   96.3      9570   98.3
 NOD       5247   99.3      129433   97.6      9721   99.9
 NVC       5190   98.2           -      -      9658   99.2
 PAV       5250   99.3      132118   99.7      9586   98.5
 PRO          -      -           -      -      2475   25.4
 SCN       5285  100.0      127871   96.5      9730  100.0
 SWP       5231   99.0      131675   99.3      9622   98.9
 TSC       2826   53.5       62267   47.0      6823   70.1
 VSP       3356   63.5       95727   72.2         5    0.1
 -----------------------------------------------------------
 Mean value:      89.3%             90.5%             90.5% (*)
 -----------------------------------------------------------

Explanation of the different columns (see also 1-5 at Tables A1/AA):
6) "Number (%) of objects infected with file viruses" is the number of
   *files* infected with file-infecting viruses from the test set
   which are detected by the particular scanner as being infected. The
   percentage of those files out of the full set of files is given in
   parenthesis. We often have more than one infected file per virus,
   but not all viruses are represented by the same number of files, so
   this number does not give a good impression of the real detection
   rate of the scanner. It is included here only for completeness. Of
   course, it still *does* provide some information - usually, the
   better a scanner is, the more files it will detect as infected.
7) "Number (%) of objects infected with boot viruses" is the number of
   infected boot sectors in the test set that the scanner detects as
   infected.
   This field is analogous to field 6, though it lists infected boot
   sectors, not files.
8) "Number of objects infected with macro viruses" is the number of
   infected documents in the test set that the scanner detects as
   infected. This field is analogous to field 6, though it lists
   infected documents, not files.

(*) Remark concerning the mean value of macro virus detection: for a
    fair basis of comparison, the unusually low detection value (0.1%)
    of VSP was not counted (mean value including VSP: 85.2%).

------------------------ Overview Table A4: ---------------------

Table A4 provides information about the "quality" of detection.
Inconsistent identification means that some virus is identified with
different names in different objects belonging to the same virus.
Unreliable detection means that some virus is identified at least
once, though not in all objects infected with the related virus. The
optimum measure both for inconsistency and for unreliability is 0%.
For detailed results, see "6bdosfil.txt", "6cdosboo.txt" and
"6ddosmac.txt".
Table A4: Consistency/Reliability of Virus Detection in Full DOS Test:
======================================================================
 Scanner   Inconsistent Identification:   Unreliable Detection:
           Boot(%)  File(%)  Macro(%)     Boot(%)  File(%)  Macro(%)
 -------------------------------------------------------------------
 AVA          3.1      4.4      0.8          1.0      0.8      0.3
 AVG          1.1      2.9      0.5         17.1      2.1      0.2
 AVP          4.3      2.5      1.7          0.3      0.0      0.0
 CMD          0.9      0.3      0.1          1.5      0.2      0.1
 DRW          2.1      3.0        -          1.1      1.1        -
 FPR          0.2      0.1      0.1          0.5      0.1      0.1
 FSE          4.2      2.6      1.8          0.5      0.1      0.0
 INO          7.8      3.5      1.5          3.8      0.8      0.1
 MR2          8.6     13.2      4.1          5.9      4.6      1.4
 NAV         31.5      0.0      0.0          7.5      2.5      0.2
 NOD          7.8     11.3      0.7          0.2      1.4      0.0
 NVC          2.3        -      1.0          1.3        -      0.3
 PAV          4.1      2.4      1.8          0.2      0.1      0.0
 PRO            -        -      0.5            -        -      0.8
 SCN          2.3      3.0      0.5          0.0      0.0      0.0
 SWP          3.8      4.9      0.7          0.8      0.8      0.1
 TSC          1.7      2.0      4.3          2.8      3.3      1.4
 VSP          6.1     16.2      0.0         11.0      5.7      0.1
 -------------------------------------------------------------------
 Mean value:  5.4%     4.5%     1.2%         3.3%     1.5%     0.3%
 -------------------------------------------------------------------

More explanations of the different columns (see 1-8 at Tables A1/A3):
9) The fields "Inconsistent Identification" measure the relative
   amount (%) of those viruses to which different names were assigned
   for the same virus. This is, to some extent, a measure of how
   precise the identification capability of the respective scanner is;
   the optimum measure is 0%.
10) The fields "Unreliable Detection" measure the relative amount (%)
    of viruses which were only partly detected. The definition of
    unreliable detection is that at least one sample of the virus *is*
    detected and at least one sample of the virus is *not* detected.
    In some sense, unreliable detection is more dangerous than a
    scanner missing the virus completely, because an unreliably
    detected virus may be a hidden source of continuous viral
    infections.

------------------------ Overview Table A5: ---------------------

Table A5 indicates whether some AntiVirus DOS products also detect
non-viral malware, esp. including virus generators, trojans and
intended (though not self-replicating) viruses. Results only apply to
macro malware, where VTC's "List of Known Macro Malware" displays the
actual status of all known malicious threats. For detailed results,
see "6bdosfil.txt" and "6ddosmac.txt".

Table A5: DOS Detection Rate of Non-Viral File+Macro Malware
============================================================
 Scanner   File Malware     Macro Malware
 ----------------------------------------
 Testbed   3691  100.0      167  100.0
 ----------------------------------------
 AVA       2237   60.6      143   85.6
 AVG       2105   57.0      137   82.0
 AVP       3169   85.9      166   99.4
 CMD       3089   83.7      165   98.8
 DRW          -      -      141   84.4
 FPR       3131   84.8      165   98.8
 FSE       3112   84.3      158   94.6
 INO       3036   82.3      161   96.4
 MR2       1604   43.5      161   96.4
 NAV       2310   62.6      157   94.0
 NOD       2386   64.6      167  100.0
 NVC          -      -      152   91.0
 PAV       3119   84.5      161   96.4
 PRO          -      -       29   17.4
 SCN       3547   96.1      167  100.0
 SWP       2833   76.8      157   94.0
 TSC       1535   41.6      112   67.1
 VSP       2145   58.1        1    0.6
 ----------------------------------------
 Mean value:      71.1%           88.0% (*)
 ----------------------------------------

(*) Remark concerning the mean value of macro malware detection: for a
    fair basis of comparison, the unusually low detection value (0.6%)
    of VSP was not counted (mean value including VSP: 83.1%).

------------------------ Overview Table AB: ---------------------

Table AB indicates, for the scanners (most recent version each time)
in the last three tests (1998-10, when detection under Windows 98 was
first tested, 1999-03 and 1999-09), how the file and macro virus
detection rates under Windows 98 developed. (Concerning the format of
this table, see Table AA.)
Table AB: Comparison: Zoo File/Macro Virus Detection Rate in last 3
          VTC tests under Windows 98:
============================================================
SCAN     File Virus Detection      Macro Virus Detection
NER     98/10  99/03  99/09 DELTA   98/10  99/03  99/09 DELTA
-------------------------------------------------------------
ACU         -      -      -     -       -   97.6      -     -
AN5         -      -   87.2     -       -      -   89.3     -
ANT      91.3      -   86.5     -    84.3      -   89.5     -
ANY         -      -      -     -    70.7      -      -     -
AVA      96.6   97.6   97.2  -0.4    96.7   95.9   93.9  -2.0
AVG         -   87.3   87.0  -0.3       -   82.5   96.6  14.1
AVK      99.6   90.8   99.8   9.0    99.6   99.6  100.0   0.4
AVP      99.9   99.9   99.8  -0.1   100.0   99.2  100.0   0.8
AVX         -   74.2   75.7   1.5       -      -   98.7     -
CMD         -      -   98.4     -       -      -   99.6     -
DSS      99.9   99.9      *     *   100.0  100.0      *     *
DWW         -   89.5   98.3   8.8       -   98.3   98.8   0.5
FPR/FMA     -   93.9   99.4   0.5    92.4   99.8   99.7  -0.1
FPW         -      -   99.2     -       -      -   99.9     -
FSE      99.8  100.0   99.9  -0.1   100.0  100.0  100.0   0.0
FWN         -      -      -     -    99.6   99.7   99.9   0.2
HMV         -      -      -     -       -   99.5      -     -
IBM      92.8      *      *     *    94.5      *      *     *
INO      93.5   98.1   97.1  -1.0    88.1   99.8   98.1  -1.7
IRS      96.7   97.6      -     -    99.0   99.5      -     -
ITM         -   64.2      -     -       -      -      -     -
IVB         -      -      -     -    92.8   95.0      -     -
MR2         -      -   65.9     -       -      -   64.9     -
NAV         -   96.8   97.6   0.8    95.3   99.7   98.7  -1.0
NOD         -   97.6   98.3   0.7       -   99.8  100.0   0.2
NV5         -      -   99.0     -       -      -   99.6     -
NVC      93.6   97.0   99.0   1.3       -   99.1   99.6     -
PAV      98.4   99.9   99.6  -0.3    99.5   99.5   86.7 -12.8
PCC         -   81.2      -     -       -   98.0      -     -
PRO         -   37.3   39.8   2.5       -   58.0   61.9   3.9
RAV      84.9      -   86.9     -    92.2      -   98.1     -
SCN      86.6   99.8   99.7  -0.1    97.7  100.0   99.8  -0.2
SWP      98.4      -   99.0     -    98.6      -   98.5     -
TBA      92.6      *      *     *    98.7      *      *     *
TSC         -   55.3   53.8  -1.5       -   76.5   64.9 -11.6
VBS         -      -      -     -    41.5      -      -     -
VBW         -   26.5      -     -       -   93.4      -     -
VET         -   66.3      *     *       -   97.6      *     *
VSP         -   86.4   79.7  -6.7       -    0.4    0.3  -0.1
-------------------------------------------------------------
Mean     95.0%  84.2%  89.7%  1.7%   92.1%  90.3%  93.5% -0.6% (*)
-------------------------------------------------------------

(*) Remark concerning the mean value of macro virus detection: for a
    fair basis of comparison, the unusually low detection value (0.3%)
    of VSP was not counted (mean value including VSP: 89.9%).
------------------------ Overview Table A6: ---------------------

Table A6 summarizes the results of the actual tests under Windows 98.
Tests were performed for the detection of file and macro viruses. In
addition, the detection of file and macro-related malware was also
tested. For detailed results, see "6ewin98.txt".

Table A6: Detection Rate of File/Macro Viruses and Malware in Full
          and ITW Tests for Windows 98:
=======================================================
 Scanner  File Viruses    File Malware    Macro Viruses   Macro Malware
 ----------------------------------------------------------------------
 Testbed  17561  100.0%   3691  100.0%    3546  100.0%    167  100.0%
 ----------------------------------------------------------------------
 AN5      15310   87.2    2419   65.5     3166   89.3     141   84.4
 ANT      15189   86.5    1914   51.9     3031   85.5     137   82.0
 AVA      17078   97.2    2231   60.4     3328   93.9     143   85.6
 AVG      15282   87.0    2123   57.5     3425   96.6     137   82.0
 AVK      17520   99.8    3218   87.2     3546  100.0     166   99.4
 AVP      17521   99.8    3221   87.3     3546  100.0     166   99.4
 AVX      13295   75.7    2011   54.5     3499   98.7     159   95.2
 CMD      17280   98.4    3085   83.6     3532   99.6     165   98.8
 DWW      17261   98.3    2501   67.8     3502   98.8     141   84.4
 FPR      17458   99.4    3182   86.2     3537   99.7     165   98.8
 FPW      17423   99.2    3131   84.8     3537   99.7     165   98.8
 FSE      17537   99.9    3533   95.7     3546  100.0     167  100.0
 FWN          -      -       -      -     3543   99.9     160   95.8
 INO      17051   97.1    2922   79.2     3478   98.1     152   91.0
 MR2      11573   65.9    1774   48.1     2302   64.9     112   67.1
 NAV      17131   97.6    3093   83.8     3501   98.7     157   94.0
 NOD      17254   98.3    2434   65.9     3546  100.0     167  100.0
 NV5      17389   99.0    2503   67.8     3531   99.6     152   91.0
 NVC      17389   99.0    2503   67.8     3531   99.6     152   91.0
 PAV      17492   99.6    3195   86.6     3504   86.7     163   97.6
 PRO       6984   39.8     456   12.4     2196   61.9      48   28.7
 RAV          -      -    1879   50.9     3478   98.1     161   96.4
 SCN      17509   99.7    3534   95.7     3540   99.8     167  100.0
 SWP      17386   99.0    2852   77.3     3494   98.5      94   56.3
 TSC       9445   53.8    1156   31.3     2302   64.9     112   67.1
 VSP      13999   79.7    2161   58.5       11    0.3       2    1.2
 ----------------------------------------------------------------------
 Mean value:      89.9%          68.3%           93.3% (*)       87.4% (*)
 ----------------------------------------------------------------------
(*) Remark concerning the mean values of macro virus/malware
    detection: for a fair basis of comparison, the unusually low
    detection values (0.3% and 1.2%, respectively) of VSP were not
    counted (mean values including VSP: macro virus detection: 89.7%;
    macro malware detection: 84.1%).

------------------------ Overview Table AC: ---------------------

Table AC indicates, for the scanners (most recent version each time)
in the last 5 tests (from 1997-07, when detection under W-NT was first
tested, until 1999-09), how the file and macro virus detection rates
under Windows NT developed. (Concerning the format of this table, see
Table AA.)

Table AC: Comparison: File/Macro Virus Detection Rate in last 5 VTC
          tests under Windows-NT:
===========================================================
Scan     ==== File Virus Detection ====    === Macro Virus Detection ===
ner       9707  9802  9810  9903  9909 Delta   9707  9802  9810  9903  9909 Delta
---------------------------------------------------------------------
ANT       88.9  69.2  91.3     -  87.2     -   92.2     -  85.7     -  89.3     -
ANY          -     -  69.7     -     -     -      -     -  70.5     -     -     -
AVA          -  97.4  96.6  97.1  97.4   0.3      -  91.9  97.2  95.2  93.3  -1.9
AVG          -     -     -  87.3  87.0  -0.3      -     -     -  82.5  96.6  14.1
AVK          -     -  99.6  90.2  99.8   8.6      -     -  99.6  99.6  100%   0.4
AVP          -     -  83.7  99.9  99.8  -0.1      -     -  100%  99.2  100%   0.8
AVX          -     -     -  74.2  75.2   1.0      -     -     -  98.9  98.7  -0.2
AW           -  56.4     -     -     -     -      -  61.0     -     -     -     -
DRW/DWW      -     -     -  93.3  98.3   5.0      -     -     -  98.3  98.8   0.5
DSS       99.6  99.7  99.9  99.3     *     *   99.0  100%  100%  100%     *     *
FPR/FMA      -  96.1     -  98.7  99.4   0.7      -  99.9  99.8  99.8  99.7  -0.1
FPW          -     -     -     -     -     -      -     -     -     -  99.7     -
FSE          -  85.3  99.8  100%  99.9  -0.1      -     -  99.9  100%  100%   0.0
FWN          -     -     -     -     -     -      -     -  99.6  99.7     -     -
HMV          -     -     -     -     -     -      -     -  99.0  99.5     -     -
IBM       95.2  95.2  77.2     *     *     *   92.9  92.6  98.6     *     *     *
INO          -  92.8     -  98.1  98.0  -0.1      -  89.7     -  99.8  99.7  -0.1
IRS          -  96.3     -  97.6     -     -      -  99.1     -  99.5     -     -
IVB          -     -     -     -     -     -      -     -  92.8  95.0     -     -
MR2          -     -     -     -  61.9     -      -     -     -     -  69.6     -
NAV       86.5  97.1     -  98.0  97.6     -   95.6  98.7  99.9  99.7  98.7  -1.0
NOD          -     -     -  97.6  98.2   0.6      -     -     -  99.8  100%   0.2
NVC       89.6  93.8  93.6  96.4     -     -   96.6  99.2     -  98.9  98.9   0.0
NVN          -     -     -     -  99.0     -      -     -     -     -  99.5     -
PAV       97.7  98.7  98.4  97.2  99.6   2.4   93.5  98.8  99.5  99.4  99.7   0.3
PRO          -     -     -  37.3  42.4   5.1      -     -     -  58.0  61.9   3.9
RAV          -  81.6  84.9  85.5     -     -      -  98.9  99.5  99.2     -     -
RA7          -     -     -  89.3     -     -      -     -     -  99.2     -     -
PCC       63.1     -     -     -     -     -      -  94.8     -     -     -     -
PER          -     -     -     -     -     -      -  91.0     -     -     -     -
SCN       94.2  91.6  71.4  99.1  99.8   0.7   97.6  99.1  97.7  100%  100%   0.0
SWP       94.5  96.8  98.4     -  99.0     -   89.1  98.4  97.5     -  98.4     -
TBA          -  93.8  92.6     *     *     *   96.1     -  98.7     *     *     *
TNT          -     -     -     *     *     *      -     -  44.4     *     *     *
VET       64.9     -     -  65.4     *     *      -  94.0     -  94.9     *     *
VSA          -  56.7     -     -     -     -      -  84.4     -     -     -     -
VSP          -     -     -  87.0  69.8 -17.2      -     -     -  86.7   0.3 -86.4
--------------------------------------------------------------------
Mean      87.4  88.1  89.0  89.2  90.0%  0.3%  94.7  95.9  91.6  95.3  95.1%  1.1 (*)
--------------------------------------------------------------------

(*) Remark concerning the mean value of macro virus detection: for a
    fair basis of comparison, the unusually low detection value (0.3%)
    of VSP was not counted (mean value including VSP: 90.6%,
    deviation -4.4%).

------------------------ Overview Table A7: ---------------------

Table A7 summarizes the results of the tests under Windows NT. Tests
were performed for the detection of file and macro viruses. In
addition, the detection of file and macro-related malware was also
tested. Those scanners which were submitted as being identical for
Windows 98 and Windows NT were only tested under Windows 98 (see
Table A6). For detailed results, see "6fwinnt.txt".
Table A7: Detection Rate of File/Macro Viruses and Malware in Full
          and ITW Tests for Windows NT:
=======================================================
 Scanner  File Viruses    File Malware    Macro Viruses   Macro Malware
 ----------------------------------------------------------------------
 Testbed  17561  100.0%   3691  100.0%    3546  100.0%    167  100.0%
 ----------------------------------------------------------------------
 ANT      15310   87.2    2419   65.5     3166   89.3     141   84.4
 AVA      17107   97.4    2235   60.6     3328   93.9     143   85.6
 AVG      15283   87.0    2123   57.5     3425   96.6     137   82.0
 AVK      17520   99.8    3218   87.2     3546  100.0     166   99.4
 AVP      17521   99.8    3221   87.3     3546  100.0     166   99.4
 AVX      13203   75.2    2011   54.5     3499   98.7     159   95.2
 CMD      17280   98.4    3085   83.6     3532   99.6     165   98.8
 DWW      17261   98.3    2501   67.8     3502   98.8     141   84.4
 FPR      17458   99.4    3182   86.2     3537   99.7     165   98.8
 FPW      17423   99.2    3131   84.8     3537   99.7     165   98.8
 FSE      17537   99.9    3533   95.7     3546  100.0     167  100.0
 FWN          -      -       -      -     3504   98.8     160   95.8
 INO      17207   98.0    3036   82.3     3535   99.7     161   96.4
 MR2      10877   61.9    2125   57.6     2468   69.6     112   67.1
 NAV      17131   97.6    3093   83.8     3501   98.7     157   94.0
 NOD      17250   98.2    2476   67.1     3546  100.0     167  100.0
 NVC          -      -       -      -     3508   98.9     152   91.0
 NVN      17389   99.0    2503   67.8     3529   99.5     152   91.0
 PAV      17492   99.6    3195   86.6     3536   99.7     163   97.6
 PRO       7440   42.4     468   12.7     2196   61.9      48   28.7
 SCN      17526   99.8    3512   95.2     3546  100.0     166   99.4
 SWP      17386   99.0    2852   77.3     3491   98.4     157   94.0
 VSP      12250   69.8    2153   58.3       11    0.3       2    1.2
 ----------------------------------------------------------------------
 Mean value:      90.8%          72.4%           95.5% (*)       90.1% (*)
 ----------------------------------------------------------------------

(*) Remark concerning the mean values of macro virus/malware
    detection: for a fair basis of comparison, the unusually low
    detection values (0.3% and 1.2%, respectively) of VSP were not
    counted (mean values including VSP: macro virus detection: 91.4%;
    macro malware detection: 86.2%).
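The recurring "(*)" remarks recompute the mean values with VSP's outlier
excluded. A minimal Python sketch of that calculation (the rate values
below are illustrative, not a full table column):

```python
# Sketch: mean detection rate over a result column, optionally excluding
# scanners with outlier values (as done for VSP in the remarks above).
def mean(rates, exclude=()):
    vals = [r for scanner, r in rates.items() if scanner not in exclude]
    return round(sum(vals) / len(vals), 1)

macro_rates = {"AVK": 100.0, "AVP": 100.0, "SCN": 100.0, "VSP": 0.3}
print(mean(macro_rates))                   # 75.1
print(mean(macro_rates, exclude={"VSP"}))  # 100.0
```

As the example shows, a single near-zero result can drag the mean down
by tens of percentage points, which is why the tables report both
variants.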