============================================================
File 0XECSUM.TXT:
Released: September 24, 2000
Virus Test Center (VTC), University of Hamburg
AntiMalware Product Test "2000-08"
============================================================
[Formatted with non-proportional font (Courier), 72 columns]

Content of this text:
=====================
 1. Background of this test; Malware Threats
 2. Products included in VTC Test "2000-08"
 ---------------------------------------------------------
 3. Summary #1.0: Macro virus detection quality of DOS AV products
                  on a high level, HOWEVER script virus detection
                  needs improvement!
    ---------------------------------------------------------
    #1.1: The number of products submitted for DOS tests is now
          down to 10; with the proliferation of Windows platforms,
          DOS scanners become less relevant, and AV companies
          concentrate on other platforms.
    #1.2: Detection rates for zoo macro viruses have reached a very
          high level (mean: 98.6%), with 4 products detecting
          "almost all" macro viruses: AVK, CMD, FPR, SCN; 2 of
          these also detect "almost all" instantiations: CMD and
          SCN.
    #1.3: HOWEVER: detection rates for script viruses (even on a
          small collection of VBS, mIRC and JavaScript viruses) are
          FAR FROM acceptable, with a mean detection rate as low as
          66.4%:
             NO product is "perfect" or even "very good" (>95%)
             3 products are "good": CMD (93.5), AVK (91.5),
                                    FPR (90.5)
          As related threats grow significantly, much more work
          must be invested.
 ------------------------------------------------------------------
 4. Summary #2.0: ITW macro detection rates under DOS rather
                  perfect!
    -------------------------------------------------------
    #2.1: 7 (of 10) products detect all ITW macro viruses in all
          instantiations (samples): AVK, CMD, DRW, FPR, INO, NVC
          and SCN
    #2.2: AV companies seem to lose interest in DOS AV products.
 -------------------------------------------------------------
 5. Summary #3.0: Macro virus detection rates under W-98 on a high
                  level BUT script virus detection rates
                  insufficient!
    -----------------------------------------------------
    #3.1: Detection rates for macro viruses for scanners under
          Windows 98 are rather stable at a fairly high, though not
          perfect, level:
             Perfect scanners (100%): 4 (last test: 3)
                CMD, DSE, FPW and FSE.
             Almost perfect scanners: 4
                AVK, AVP, PAV and SCN.
             Excellent scanners (>99%): 3 (last test: 7)
                NVC, INO and AVX.
    #3.2: HOWEVER: similar to DOS (see #1.3), detection rates for
          script viruses under W-98 are FAR FROM acceptable:
             NO product is "perfect" (100%) or "excellent" (>99%)
             3 products are "very good" (>95%): FSE, DSE, SCN
             4 products are "good" (>90%): CMD, AVK, FPW, PAV
    #3.3: AV producers must invest significantly more work and
          quality into the detection of script viruses, as this
          threat is growing significantly for W-98 users!
 ------------------------------------------------------------
 6. Summary #4.0: Macro virus detection rates under W-NT on a high
                  level, HOWEVER script virus detection rates
                  insufficient.
    ----------------------------------------------------
    #4.1: Detection rates for macro viruses for scanners under
          Windows NT are rather stable at a fairly high level, with
          4 products on a "perfect" level:
             Perfect scanners (100%): 4 (last test: 1)
                CMD, FPW, FSE, SCN.
             Almost perfect scanners: 3
                AVK, AVP, PAV.
             Excellent scanners (>99%): 3 (last test: 8)
                NVC, INO, AVX.
    #4.2: HOWEVER: similar to DOS (see #1.3) and W-98 (see #3.2),
          detection rates for script viruses under W-NT are FAR
          FROM acceptable:
             NO product is "perfect" (100%) or "excellent" (>99%)
             2 products are "very good": FSE, SCN
             4 products are "good": CMD, AVK, FPW, PAV
    #4.3: AV producers must invest significantly more work and
          quality into the detection of script viruses, as this
          threat is growing significantly for W-NT users!
 -------------------------------------------------------------
 7. Summary #5.0: Macro virus detection rates under W-2k on a high
                  level, HOWEVER script virus detection needs
                  significant improvement.
    ----------------------------------------------------
    #5.1: Detection rates for macro viruses for scanners under
          Windows 2000 are starting at a high level, with 4
          products on a "perfect" level:
             Perfect scanners (100%): CMD, FPW, FSE, SCN.
             Almost perfect scanners: AVK, AVP, PAV.
             Excellent scanners (>99%): NVC, INO, AVX.
    #5.2: HOWEVER: similar to all other platforms, detection rates
          for script viruses under W-2k are FAR FROM acceptable:
             NO product is "perfect" (100%) or "excellent" (>99%)
             3 products are "very good": FSE, SCN, CMD
             4 products are "good": AVK, AVP, FPW, PAV
    #5.3: AV producers must invest significantly more work and
          quality into the detection of script viruses, as this
          threat is growing significantly for W-2k users!
 -------------------------------------------------------
 8. Summary #6.0: Detection rates of packed macro viral objects
                  improving on all platforms!
    --------------------------------------------------------------
    #6.1) Perfect scanners detect packed macro viruses both under
          DOS and all W-32 platforms for ALL 4 methods:
          AVK, CMD and SCN
    #6.2) Some perfect scanners detect packed macro viruses under
          all W-32 platforms for all 4 methods: AVP, AVX and PAV
    #6.3) But there is still need for improvement.
 -------------------------------------------------------------
 9. Summary #7: Avoidance of False-Positive alarms is improving:
    --------------------------------------------------------------
    #7.1) FP-avoiding "perfect" products, ALL platforms (3):
          AVK, INO, SCN
    #7.2) FP-avoiding perfect DOS scanners (4):
          ANT, AVK, INO, SCN
    #7.3) FP-avoiding perfect W-32 scanners (8):
          AVA, AVG, AVK, AVP, INO, PAV, PRO, SCN
    #7.4) There is still need for improvement in FP avoidance.
 --------------------------------------------------------------
 10. Summary #8: Macro malware detection on ALL platforms
                 improving:
    --------------------------------------------------------------
    #8.1) "Perfect" AntiMalware products for DOS/W-32:
          CMD, FPR/FPW
    #8.2) "Excellent" AntiMalware products for DOS/W-32:
          FSE, SCN
 --------------------------------------------------------------
 11. Summary #9: Many W-32 scanners perform equally on W-32
                 platforms!
    -----------------------------------------------------
    #9.1: The assumption that 32-bit engines in scanners produce
          the same detection rate for different instantiations of
          32-bit operating systems (esp. for W-98, W-NT and W-2000)
          is now correct for the majority of scanners.
    #9.2: Analysis of ITW detection rates is NOT sufficient to
          determine the behaviour of 32-bit engines and does not
          guarantee equal detection rates for different W-32
          platforms (esp. W-98/W-NT).
 --------------------------------------------------------
 =================================================
 12. Conclusion: Searching for the "Perfect AV/AM product"
     Best AV products in test:
     **************************************
     "Perfect" AntiVirus product:  NONE
     --------------------------------------
     "Excellent" AV products:
         1st place: SCN
         2nd place: AVK
         3rd place: CMD
         4th place: FPR/FPW
         5th place: INO
         6th place: AVP
         7th place: FSE
         8th place: NVC
         9th place: AVX
        10th place: DRW
     **************************************
     "Perfect" AntiMalware product:  NONE
     --------------------------------------
     "Excellent" AntiMalware products:
         1st place: SCN
         2nd place: CMD
         2nd place: FPR/FPW
         4th place: FSE
     **************************************
 13. Availability of test results
 14. Copyright, License, and Disclaimer

Tables:
=======
Table ES0: Development of viral/malware threats
Table ES2: List of AV products in test 2000-08
Table ES3: Development of DOS scanners from 1997-02 to 2000-08
Table ESA: Development of W-98 scanners from 1997-07 to 2000-08
Table ES4: Development of W-NT scanners from 1997-07 to 2000-08
Table ES5: Performance of W-2k scanners in test 2000-08


1. Background of this test: Malware Threats:
============================================

Malicious software (malware) has many different faces. Presently,
"computer viruses" are the dominant type of self-replicating
malicious software (= self-replicating malware). They operate and
spread on single (though possibly connected) platforms such as
Intel/Microsoft based systems, on which the vast majority of malware
operates, presently with more than 50,000 instantiations. But from a
few to several hundred "native" viruses are also available for
almost any other operating platform, esp. including Macintosh and
UNIXes (including Linux based systems), and they have also been
observed on platforms such as Java as well as handheld devices
(e.g. Palm and PSION PDAs) and cellular phones with WML/WAP support.
Viruses such as the infamous Michelangelo (affecting DOS boot
processes) and "Melissa" (a document infector) spread in principle
on single systems, even if they are imported via media (e.g.
diskettes, email attachments).

With the growth of network connections and operations, "worms" as a
form of malware genuinely self-replicating in networks beyond single
systems grow significantly both in numbers and impact. Since the
last VTC test, the infamous "ILoveYou" worm has been widely reported
as having "infected" an estimated 50 million PCs via their Internet
connection. This worm carries a virus spreading locally on any
connected PC, with the ability to further distribute the worm to
other Internet connections. Within only 4 months, more than 50
"variants" of this worm were observed, some of which were deployed
on a world-wide scale. Besides worms, other forms of malicious
self-replicating or self-distributing software include agents,
hostile applets etc.

Self-replicating malware may (or may not) contain a "payload",
which may be activated upon some trigger (or none) and which will
perform some effect (or "damage") on systems, applications, data or
media. A pure (that is: not self-replicating) payload may reside, in
the form of a "Trojan horse", inside other software, which then
executes its malicious functions upon triggering. Such a "payload"
may itself be self-replicating malware which may again carry another
payload. In this way, worms may carry a virus which can again deploy
a worm into the Net, and which may damage specified data types (as
observed with the infamous "ILoveYou" worm, aka VBS/LoveLetter.A).
Other forms of non-self-replicating malware include "intended" (but
not properly working) viruses, "germs" (= initial stages of
viruses), backdoors etc.

The development of malicious software can well be studied in view of
the growth of VTC testbeds. The following table summarizes, for
previous and current VTC tests (indicated by their year and month of
publication), the size of the virus and malware (full = "zoo")
databases, giving in each case the number of different viruses or
malware specimens and the number of instantiations, keeping in mind
that some revisions of the testbeds were made:

Table ES0: Development of threats as present in VTC test databases:
===================================================================
         =File viruses=  =Boot viruses=  =Macro viruses=  == Malware ==  =Script viruses=
Test#    Number Infected Number Infected Number Infected  file    macro  Number Infected
         viruses objects viruses objects viruses objects  malware malware viruses objects
--------------------------------------------------------------------------------------
1997-07: 12,826  83,910    938   3,387     617   2,036      213      72     ---    ---
1998-03: 14,596 106,470  1,071   4,464   1,548   4,436      323     459     ---    ---
1998-10: 13,993 112,038    881   4,804   2,159   9,033    3,300     191     ---    ---
1999-03: 17,148 128,534  1,197   4,746   2,875   7,765    3,853     200     ---    ---
            + 5 146,640  (VKit + 4*Poly)
1999-09: 17,561 132,576  1,237   5,286   3,546   9,731    6,217     329     ---    ---
            + 7 166,640  (VKit + 6*Poly)
2000-04: 18,359 135,907  1,237   5,379   4,525  12,918    6,639     260     ---    ---
            + 7 166,640  (VKit + 6*Poly)
2000-08:    ---     ---    ---     ---   5,418  15,720      ---     500     306    527
--------------------------------------------------------------------------------------

Remark #1: Before test "1998-10", an ad-hoc cleaning operation was
applied to remove samples whose virality could not easily be proved.
Since test "1999-03", separate tests are performed to evaluate
detection rates for VKit-generated and selected polymorphic file
viruses. Test "2000-08" deals with macro viruses/malware plus a
selected database of script viruses, including VBS, mIRC and
JavaScript viruses.

Remark #2: With the growing size of testbeds, rounding (esp. "near
100%") becomes important. As it seems inadequate to add more
fraction digits, the concept of "almost 100%" (= "rounded to 100%
within the given precision") is introduced with test "2000-08",
denoted as "100~" or simply "100", to distinguish between "exact"
and "almost exact" identification.
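For illustration, the "100~" notation of Remark #2 amounts to a
small rounding rule. The following Python sketch is purely
illustrative (the helper name and sample numbers are invented, not
part of the VTC test tooling):

    def format_rate(detected, total, digits=1):
        # Format a detection rate in percent; a rate that rounds to
        # 100% without being exact is denoted "100~" (Remark #2).
        rate = 100.0 * detected / total
        if detected == total:
            return "100%"                 # exact: every sample found
        if round(rate, digits) >= 100.0:
            return "100~"                 # "almost 100%"
        return "%.*f" % (digits, rate)

    # e.g. 5417 of 5418 zoo macro viruses found:
    # 99.98% rounds to 100.0 at one fraction digit -> "100~"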
With the annual deployment of more than 5,000 viruses and at least
1,000 Trojan horses, many of which are available from the Internet,
and in the absence of inherent protection against such dysfunctional
software, users must rely on AntiMalware and esp. AntiVirus software
to detect and eradicate - where possible - such malicious software.
Hence, the detection quality of AntiMalware and esp. AntiVirus
products becomes an essential prerequisite of protecting customer
productivity and data. The Virus Test Center (VTC) at Hamburg
University's Faculty for Informatics therefore performs regular
tests of AntiMalware and esp. AntiVirus software.

VTC recently tested current versions of on-demand scanners for their
ability to identify PC viruses. Tests were performed on VTC's
malware databases, which were frozen in their status as of
*** April 30, 2000 *** to give AV/AM producers a fair chance to
provide updates within the 8-week submission period. Scanners were
requested for submission (or download) before June 1, 2000.

The main test goal was to determine detection rates, reliability
(= consistency) of macro virus identification and reliability of
detection rates for submitted or publicly available scanners; for
the first time, scanners were also tested against a selected subset
of available non-macro script viruses, esp. including VBS, mIRC and
JavaScript (JS) viruses. For both macro and script viruses, it was
also tested whether viruses packed with 4 popular compression
methods (PKZIP, ARJ, LHA and RAR) would be detected (and to what
degree) by the scanners. Moreover, avoidance of False Positive
alarms on "clean" (= non-viral and non-malicious) macro objects was
also determined. Finally, a set of selected non-viral macro malware
(droppers, Trojan horses, intended viruses etc.) was used to
determine whether and to what degree AntiVirus products may be used
for protecting customers against Trojan horses and other forms of
malware.

VTC maintains, in close and secure connection with AV experts
worldwide, collections of boot, file and macro viruses as well as
related malware ("zoo") which have been reported to VTC or AV labs.
Moreover, following the list of "In-The-Wild Viruses" (published on
a regular basis by Wildlist.org), a collection of viruses reported
to be broadly visible is maintained to allow for comparison with
other tests; presently, this list does not report ITW malware.


2. Products included in VTC "Test 2000-08"
==========================================

For test "2000-08", the following *** 24 *** AntiVirus products
(addressed in subsequent tables by a 3-letter abbreviation) under
DOS, Windows-98, Windows-NT and - for the first time - Windows-2000
(in total: 79 versions) were tested:

Table ES2: List of AV products in test "2000-08"
================================================
Abbreviation/Product/Version                Tested under Platform
-----------------------------------------------------------------------
ANT = H&B EDV AntiVir 6.2         (CD)    for DOS, W-98, W-NT, W-2k
ATD = AntiDote 1.50 (Vintage Sol) (http)  for W-NT, W-2k
AVA = AVAST/32                    (http)  for W-98, W-NT, W-2k
AVG = AVG                 (CD/snail mail) for W-98, W-NT, W-2k
AVK = AVK 9 (June 5, 2000)        (CD)    for DOS, W-98, W-NT, W-2k
AVP = AVP Platinum                (email) for W-98, W-NT, W-2k
AVX = AVX 5.5                     (email) for W-98, W-NT, W-2k
CLE = Cleaner 3.1                 (http)  for W-98
CMD = Command Software AV         (FTP)   for DOS, W-98, W-NT,
                                              W-2k (beta)
DRW = DrWeb32 v.4.17              (FTP)   for DOS, W-98, W-NT, W-2k
DSE = Dr Solomon Emergency AV (updated sig) for W-98
FPR = FProt 3.06c                 (ftp)   for DOS
FPW = FProt FP-WIN 3.05           (ftp)   for W-98, W-NT, W-2k
FSE = FSAV                        (ftp)   for DOS, W-98, W-NT, W-2k
INO = Inoculan 4.53 (updated)     (FTP)   for DOS, W-98, W-NT, W-2k
NAV = NAV (signature updates)     (ftp)   for DOS, W-98, W-NT, W-2k
NVC = NVC 4.8                     (http)  for DOS, W-98, W-NT, W-2k
PAV = PAV 2000                    (CD)    for DOS, W-98, W-NT, W-2k
PER = PERuvian AV 2.3(NT),6.3(98) (http)  for W-98, W-NT, W-2k
PRO = Protector                   (http)  for W-98, W-NT, W-2k
QHL = QuickHeal 5.24              (http)  for W-98, W-NT, W-2k
RAV = RAV 7.6.02                  (email) for W-98, W-NT, W-2k
SCN = NAI VirusScan               (email) for DOS, W-98, W-NT, W-2k
UKV = Ultimate Killer Vaccine     (email) for DOS
-----------------------------------------------------------------------

Details of the products (versions etc.) are described in
A2SCANLS.TXT. Problems observed during the tests (which may have
adversely influenced published results) are described in some detail
in 8PROBLMS.TXT.
The following products could not be tested:
   MKS (download problems)

The following product was excluded from the test: MR2S.
Reason: an author of this product (a student at another German
university, though without due reputation in technical matters
related to reverse-engineering), which performed "less favourably"
(less politely: miserably) in a previous VTC test, accused VTC of
"inadequate testing". His argument that his product was well able to
detect viruses which were reported "missed" in the test (as
documented in the test protocols published on the VTC website) was
unjustified, as the product was not even able (probably due to a
known bug in MS software which other AV products circumvent) to
touch all samples in the VTC testbed. The illicit accusations were
presented in a group to which VTC has no access (nor wishes to), but
were reported by trustworthy persons. VTC does not wish to invest
any work or time into products whose supporters behave unethically
(at least).

Also excluded from VTC tests: Panda AntiVirus.
Reason: this product has been excluded from any VTC test since one
of its senior managers requested to receive VTC's complete virus and
malware databases as a prerequisite for any further participation.
It is VTC's standing practice to give essential samples missed by
some product to the related AV producer, provided that a secure
channel can be established; such exchange is guided by principles of
need-to-know, technical expertise and trustworthiness on the
partner's side.

Also excluded from VTC tests (since "2000-08"): Sweep = Sophos
AntiVirus.
Reason: this product has been excluded from VTC tests since we
learned that Sophos exerted undue pressure on another AV company to
join an activity named "Rapid Exchange of Virus Samples" (REVS). VTC
never cooperates with any person or organisation which deliberately
"exchanges" potentially harmful material without due guarantees for
the technical proficiency and ethical standards of the related
partners. VTC regards REVS as sufficiently similar in goal and
observed practice to the VX activities of virus authors and their
adherents. VTC will only include Sophos AntiVirus again when there
is sufficient evidence that virus "exchange" is and will be no
longer practiced.

The following product was not submitted for this test, but there was
an indication that it will be submitted to a future test: PCCillin.

In general, AV products were either submitted or, when test versions
were available on the Internet, downloaded from the respective
ftp/http sites. A few scanners were not available, either in general
(TNT) or for this test, some of which were announced to participate
in some future test. Finally, very few AV producers answered VTC's
bids for submitting scanners with electronic silence.

The following paragraphs summarize those findings related to the
detection of macro viruses and non-replicating malware. For details,
see files 6DDOSPRE.TXT, 6FW98PRE.TXT and 6GWNTPRE.TXT.


3. Summary #1: Development of DOS scanner detection rates:
==========================================================

Concerning the performance of DOS scanners, a comparison of virus
detection results in tests from "1997-02" until the present test
"2000-08" shows how scanners behave and how manufacturers succeed in
adapting their products to the growing threat of new viruses. The
following table lists the development of detection rates of scanners
(most recent version in each test), and it calculates the changes
("+" indicating improvement) in detection rates.
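To make the tables easier to read: the "Diff" (or "Delta") column is
simply the detection rate of the newest test minus that of the
previous one, and the "Mean" row averages the rates of all products
tested in one round. A minimal Python sketch (values taken from the
ANT row of Table ES3 below; variable names are invented):

    # Macro virus detection rates (%) of two consecutive tests:
    rate_0004, rate_0008 = 85.9, 93.3
    diff = round(rate_0008 - rate_0004, 1)   # +7.4, as in the table

    # The "Mean" row averages one column over all tested products:
    rates_0008 = [93.3, 100.0, 100.0, 97.6]  # excerpt, not full column
    mean = sum(rates_0008) / len(rates_0008)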
For reasons of fairness, it must be noted that improvement is much
more difficult to achieve for those products which have already
reached a very high level of detection and quality (say: more than
95%) than for those products with lower detection rates. Some
products have incorporated new engines (esp. for 32-bit platforms)
and included formerly separate scanners (e.g. for macro viruses),
which leads to improved performance.

Generally, changes in the order of about +-1.5% are less
significant, as this is about the growth rate of new viruses per
month, so detection depends strongly upon whether some virus is
reported (and analysed and included) just before a new update is
delivered.

Table ES3 lists developments for the detection of file and macro
viruses; moreover, results for the detection of a small set of
"script" viruses (including VBS, mIRC and JavaScript viruses) are
added in a new column. For details of DOS macro and script virus
detection, see the result tables (6bDOS) and the evaluation
(7eval.txt).

Table ES3: Improvement of DOS scanners from 1997-02 to 2000-08:
===============================================================
Detection:                                                                              Script
Scan --------- File Virus ------------- + ---------------- Macro Virus --------------- + Virus
ner  9702 9707 9802 9810 9903 9909 0004 I 9702 9707 9802 9810 9903 9909 0004 0008 Diff I 0008
      %    %    %    %    %    %    %   I  %    %    %    %    %    %    %    %    %   I  %
---------------------------------------+----------------------------------------------+-----
ALE  98.8 94.1 89.4  -    -    -    -   I 96.5 66.0 49.8  -    -    -    -    -    -   I  -
ANT  73.4 80.6 84.6 75.7  -    -   92.8 I 58.0 68.6 80.4 56.6  -    -   85.9 93.3  7.4 I 55.2
AVA  98.9 97.4 97.4 97.9 97.6 97.4 97.5 I 99.3 98.2 80.4 97.2 95.9 94.6 93.7  -    -   I  -
AVG  79.2 85.3 84.9 87.6 87.1 86.6  -   I 25.2 71.0 27.1 81.6 82.5 96.6  -    -    -   I  -
AVK   -    -    -   90.0 75.0  -    -   I  -    -    -   99.7 99.6  -    -   100~  -   I 91.5
AVP  98.5 98.4 99.3 99.7 99.7 99.8 99.6 I 99.3 99.0 99.9 100% 99.8 100% 99.9  -    -   I  -
CMD   -    -    -    -    -    -   99.5 I  -    -    -    -    -   99.5 100% 100~  0.0 I 93.5
DRW  93.2 93.8 92.8 93.1 98.2 98.3  -   I 90.2 98.1 94.3 99.3 98.3  -   98.4 97.6 -0.8 I 60.8
DSE  99.7 99.6 99.9 99.9 99.8  -    -   I 97.9 98.9 100% 100% 100%  -    -    -    -   I  -
FMA   -    -    -    -    -    -    -   I 98.6 98.2 99.9  -    -    -    -    -    -   I  -
FPR  90.7 89.0 96.0 95.5 98.7 99.2 99.6 I 43.4 36.1 99.9 99.8 99.8 99.7 100% 100~  0.0 I 90.5
FSE   -    -   99.4 99.7 97.6 99.3 99.9 I  -    -   99.9 90.1 99.6 97.6 99.9  -    -   I  -
FWN   -    -    -    -    -    -    -   I 97.2 96.4 91.0 85.7  -    -    -    -    -   I  -
HMV   -    -    -    -    -    -    -   I  -    -   98.2 99.0 99.5  -    -    -    -   I  -
IBM  93.6 95.2 96.5  -    -    -    -   I 65.0 88.8 99.6  -    -    -    -    -    -   I  -
INO   -    -   92.0 93.5 98.1 94.7 94.6 I  -    -   90.3 95.2 99.8 99.5 99.7 99.7  0.0 I 77.8
IRS   -   81.4 74.2  -   51.6  -    -   I  -   69.5 48.2  -   89.1  -    -    -    -   I  -
ITM   -   81.0 81.2 65.8 64.2  -    -   I  -   81.8 58.2 68.6 76.3  -    -    -    -   I  -
IVB   8.3  -    -    -   96.9  -    -   I  -    -    -    -    -    -    -    -    -   I  -
MR2   -    -    -    -    -   65.4  -   I  -    -    -    -    -   69.6  -    -    -   I  -
NAV  66.9 67.1 97.1 98.1 77.2 96.0 93.3 I 80.7 86.4 98.7 99.8 99.7 98.6 97.4 97.0 -0.4 I 24.8
NOD   -    -    -   96.9  -   96.9 98.3 I  -    -    -    -   99.8 100% 99.4  -    -   I  -
NVC  87.4 89.7 94.1 93.8 97.6  -   99.1 I 13.3 96.6 99.2 90.8  -   99.6 99.9 99.9  0.0 I 83.7
PAN   -    -   67.8  -    -    -    -   I  -    -   73.0  -    -    -    -    -    -   I  -
PAV   -   96.6 98.8  -   73.7 98.8 98.7 I  -    -   93.7 100% 99.5 98.8 99.9  -    -   I  -
PCC   -    -    -    -    -    -    -   I  -   67.6  -    -    -    -    -    -    -   I  -
PCV  67.9  -    -    -    -    -    -   I  -    -    -    -    -    -    -    -    -   I  -
PRO   -    -    -    -   35.5  -    -   I  -    -    -    -   81.5  -    -    -    -   I  -
RAV   -    -    -   71.0  -    -    -   I  -    -    -   99.5 99.2  -    -    -    -   I  -
SCN  83.9 93.5 90.7 87.8 99.8 97.1 99.9 I 95.1 97.6 99.0 98.6 100% 100% 100% 100~  0.0 I 85.6
SWP  95.9 94.5 96.8 98.4  -   99.0 98.4 I 87.4 89.1 98.4 98.6  -   98.4 98.4  -    -   I  -
TBA  95.5 93.7 92.1 93.2  -    -    -   I 72.0 96.1 99.5 98.7  -    -    -    -    -   I  -
TSC   -    -   50.4 56.1 39.5 51.6  -   I  -    -   81.9 76.5 59.5 69.6  -    -    -   I  -
TNT  58.0  -    -    -    -    -    -   I  -    -    -    -    -    -    -    -    -   I  -
VDS   -   44.0 37.1  -    -    -    -   I 16.1  9.9  8.7  -    -    -    -    -    -   I  -
UKV   -    -    -    -    -    -    -   I  -    -    -    -    -    -    -    0.0  -   I  0.0
VET   -   64.9  -    -   65.3  -    -   I  -   94.0 97.3 97.5 97.6  -    -    -    -   I  -
VIT   -    -    -    -    -    -    7.6 I  -    -    -    -    -    -    -    -    -   I  -
VRX   -    -    -    -    -    -    -   I  -    -    -    -    -    -    -    -    -   I  -
VBS  43.1 56.6  -   35.5  -    -    -   I  -    -    -    -    -    -    -    -    -   I  -
VHU  19.3  -    -    -    -    -    -   I  -    -    -    -    -    -    -    -    -   I  -
VSA   -    -   56.9  -    -    -    -   I  -    -   80.6  -    -    -    -    -    -   I  -
VSP   -    -    -   76.1 71.7 79.6  -   I  -    -    -    -    -    -    -    -    -   I  -
VSW   -    -   56.9  -    -    -    -   I  -    -   83.0  -    -    -    -    -    -   I  -
VTR  45.5  -    -    -    -    -    -   I  6.3  -    -    -    -    -    -    -    -   I  -
XSC  59.5  -    -    -    -    -    -   I  -    -    -    -    -    -    -    -    -   I  -
---------------------------------------+----------------------------------------------+-----
Mean 74.2 84.8 84.4 85.4 81.2 90.6 98.3 I 69.6 80.9 83.8 89.6 93.6 88.2 98.0 98.6 +0.7 I 66.4
---------------------------------------+----------------------------------------------+-----

Remark: for abbreviations and details of products present only in
previous tests, see the related parts of the VTC test report.
Results for file virus detection will be published in the final
report.

Concerning the rating of DOS scanners, the following grid is applied
to classify scanners:
 - detection rate = 100.0%  : scanner is graded "perfect"
 - detection rate ~ 100%    : scanner is graded "almost perfect"
 - detection rate above 95% : scanner is graded "excellent"
 - detection rate above 90% : scanner is graded "very good"
 - detection rate of 80-90% : scanner is graded "good enough"
 - detection rate of 70-80% : scanner is graded "not good enough"
 - detection rate of 60-70% : scanner is graded "rather bad"
 - detection rate of 50-60% : scanner is graded "very bad"
 - detection rate below 50% : scanner is graded "useless"
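The grading grids used throughout this report amount to simple
threshold functions. The following Python sketch illustrates the
grid defined above (the function name is invented; the "almost
perfect" case reuses the "100~" rounding rule of Remark #2):

    def grade(rate_percent):
        # Map a zoo detection rate (%) to the VTC grade above.
        if rate_percent == 100.0:           return "perfect"
        if round(rate_percent, 1) >= 100.0: return "almost perfect"
        if rate_percent > 95.0:             return "excellent"
        if rate_percent > 90.0:             return "very good"
        if rate_percent >= 80.0:            return "good enough"
        if rate_percent >= 70.0:            return "not good enough"
        if rate_percent >= 60.0:            return "rather bad"
        if rate_percent >= 50.0:            return "very bad"
        return "useless"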
To assess an "overall grade" for macro virus detection, the lowest
of the related results is used to classify the respective scanners.
If several scanners of the same producer have been tested, grading
is applied to the most recent version (which is, in most cases, the
version with the highest detection rates). Only scanners where all
tests were completed are considered; here, the most recent version
with completed tests was selected. (For problems in the test: see
8PROBLMS.TXT.)

With the prerequisite that all instantiations of ITW macro viruses
are detected "perfectly" (= 100%), it is worthwhile to observe that
NO AV product is "perfect" in this test, whereas 3 products were
"perfect" in the last VTC test. The following list grades those
scanners for their performance in zoo macro virus detection in
unpacked form:

Grading the Detection of Zoo Macro Viruses:
-------------------------------------------
"Perfect" DOS macro scanners:      NONE
"Almost perfect" macro scanners:   CMD  (100~ 100~)  (-)
                                   SCN  (100~ 100~)  (-)
"Excellent" DOS macro scanners:    AVK  (100~ 99.9)  (-)
                                   FPR  (100~ 99.9)  (-)
                                   NVC  (99.9 99.8)  (=)
                                   INO  (99.7 99.7)  (=)

Remark: "+" indicates that the product performed better than in the
last test; "=" means an equal detection rate, and "-" indicates a
lower detection rate.

HOWEVER: Detection rates for script viruses (even on a small
collection of VBS, mIRC and JavaScript viruses) are FAR FROM
ACCEPTABLE, with mean detection just at 66.4% (BTW: rates are even
lower on Windows platforms).

Grading of script virus detection under DOS:
--------------------------------------------
NO product reaches "very good" detection (>95%);
3 products reach "good" detection (>90%):
     CMD (93.5), AVK (91.5), FPR (90.5)

*************************************************************************
Results #1: AV companies seem to be less interested in DOS scanners,
     which have reached a very high level of macro virus detection,
     with 4 products detecting "almost all" zoo viruses (though not
     "exactly all"). The detection rate for script viruses (esp.
     VBS, mIRC and JavaScript) is unacceptably low and needs more
     work.
*************************************************************************
#1.1) The number of products submitted for DOS tests is now down to
      10; with the proliferation of Windows platforms, DOS scanners
      become less relevant, and AV companies concentrate on other
      platforms.
#1.2) Detection rates for zoo macro viruses have reached a very
      high level (mean: 98.6%), with 4 products detecting "almost
      all" macro viruses: AVK, CMD, FPR, SCN; 2 of these also
      detect "almost all" instantiations: CMD and SCN.
#1.3) HOWEVER: detection rates for script viruses (even on a small
      collection of VBS, mIRC and JavaScript viruses) are FAR FROM
      acceptable, with a mean detection rate as low as 66.4%:
         NO product is "perfect" or even "very good" (>95%)
         3 products are "good": CMD (93.5), AVK (91.5), FPR (90.5)
      As related threats grow significantly, much more work must be
      invested.
*************************************************************************


4. Summary #2: Performance of DOS scanners on ITW testbeds:
===========================================================

Concerning "In-The-Wild" viruses, a much more rigid grid must be
applied to classify scanners, as the likelihood is significant that
a user may find such a virus on her/his machine. The following grid
is applied:
 - detection rate is 100% : scanner is "perfect"
 - detection rate is >99% : scanner is "excellent"
 - detection rate is >95% : scanner is "very good"
 - detection rate is >90% : scanner is "good"
 - detection rate is <90% : scanner is "risky"

100% detection of In-the-Wild viruses for all instantiations
(= files) is now a MUST. The following 7 (of 10) DOS products reach
100% for macro virus detection and are rated "perfect" in this
category (alphabetically ordered):

Grading the Detection of ITW Macro Viruses (DOS):
-------------------------------------------------
"Perfect" DOS ITW macro scanners:   AVK (100.0%) (+)
                                    CMD (100.0%) (=)
                                    DRW (100.0%) (=)
                                    FPR (100.0%) (=)
                                    INO (100.0%) (=)
                                    NVC (100.0%) (=)
                                    SCN (100.0%) (=)

Remark: no ITW detection test was performed for script (= VBS, mIRC
and JS) viruses, as the related databases were very small when the
testbeds were frozen.

**************************************************************
Results #2: ITW macro detection rates rather perfect!
**************************************************************
#2.1) 7 (of 10) products detect all ITW macro viruses in all
      instantiations (samples): AVK, CMD, DRW, FPR, INO, NVC
      and SCN
#2.2) AV companies seem to lose interest in DOS AV products.
**************************************************************
5. Summary #3: Macro virus detection quality of W-98 AV products:
=================================================================

As Windows-98 is an established platform for personal (less for
professional) usage, and with the imminent deployment of its
successor Windows-Me (Millennium Edition), it is interesting to
observe that the number of products has grown significantly.

Table ESA lists developments for the detection of file and macro
viruses; moreover, results for the detection of a small set of
"script" viruses (including VBS, mIRC and JavaScript viruses) are
added in a new column. For details of W-98 macro and script virus
detection, see the result tables (6bDOS) and the evaluation
(7eval.txt).

Table ESA: Improvement of W-98 scanners from 1998-10 to 2000-08:
================================================================
Detection of:                                                     -Script-
Scan    ------- File Virus -------  +  ------- Macro Virus -------- + Virus
ner      9810 9903 9909 0004 DELTA  I  9810 9903 9909 0004 0008 DELTA I 0008
          %    %    %    %    %     I   %    %    %    %    %    %    I  %
-----------------------------------+----------------------------------------
ACU       -    -    -    -    -     I   -   97.6  -    -    -    -    I  -
AN5       -    -   87.2  -    -     I   -    -   89.3  -    -    -    I  -
ANT      91.3  -   86.5 92.8  -     I  84.3  -   89.5 90.2 96.4  6.2  I 55.2
ANY       -    -    -    -    -     I  70.7  -    -    -    -    -    I  -
AVA      96.6 97.6 97.2 97.5  0.3   I  96.7 95.9 93.9 94.3 94.1 -0.2  I 15.0
AVG       -   87.3 87.0 85.4 -1.6   I   -   82.5 96.6 97.5 97.9  0.4  I  -
AVK      99.6 90.8 99.8 99.7 -0.1   I  99.6 99.6 100  99.9 100   0.1  I 91.2
AVP      99.9 99.9 99.8 99.9  0.1   I  100  99.2 100  99.9 100   0.1  I 88.2
AVX       -   74.2 75.7 77.4  1.7   I   -    -   98.7 94.5 99.0  4.5  I 61.4
CLE       -    -    -    -    -     I   -    -    -    -    -    -    I  4.2
CMD       -    -   98.4 99.6  1.2   I   -    -   99.6 100  100%  0.0  I 93.5
DSS/DSE  99.9 99.9  *   99.8  -     I  100  100   *   100  100%  0.0  I 95.8
DRW/DWW   -   89.5 98.3 96.7 -1.6   I   -   98.3 98.8 98.4 97.5 -0.9  I 59.8
ESA       -    -    -   58.0  -     I   -    -    -   88.9  -    -    I  -
FPR/FMA   -   93.9 99.4 99.7  0.3   I  92.4 99.8 99.7 100   -    -    I  -
FPW       -    -   99.2 99.6  0.4   I   -    -   99.9 100  100%  0.0  I 90.8
FSE      99.8 100  99.9 100   0.1   I  100  100  100  100  100%  0.0  I 96.7
FWN       -    -    -    -    -     I  99.6 99.7 99.9 99.8  -    -    I  -
HMV       -    -    -    -    -     I   -   99.5  -    -    -    -    I  -
IBM      92.8  *    *    *    -     I  94.5  *    *    *    -    -    I  -
INO      93.5 98.1 97.1 98.7  1.6   I  88.1 99.8 98.1 99.7 99.8  0.1  I 78.1
IRS      96.7 97.6  -    -    -     I  99.0 99.5  -    -    -    -    I  -
ITM       -   64.2  -    -    -     I   -    -    -    -    -    -    I  -
IVB       -    -    -    -    -     I  92.8 95.0  -    -    -    -    I  -
MKS       -    -    -    -    -     I   -    -    -   97.1  -    -    I  -
MR2       -    -   65.9  -    -     I   -    -   64.9  -    -    -    I  -
NAV       -   96.8 97.6 96.8 -0.8   I  95.3 99.7 98.7 98.0 97.7 -0.3  I 36.6
NOD       -   97.6 98.3 98.3  0.0   I   -   99.8 100  99.4  -    -    I  -
NV5       -    -   99.0  -    -     I   -    -   99.6  -    -    -    I  -
NVC      93.6 97.0 99.0 99.1  0.1   I   -   99.1 99.6 99.9 99.9  0.0  I 83.7
PAV      98.4 99.9 99.6 100   0.4   I  99.5 99.5 86.7 99.9 100   0.1  I 90.2
PCC       -   81.2  -    -    -     I   -   98.0  -    -    -    -    I  -
PER       -    -    -    -    -     I   -    -    -   53.7 67.2 13.5  I 18.0
PRO       -   37.3 39.8 44.6  4.8   I   -   58.0 61.9 67.4 69.1  1.7  I 12.1
QHL       -    -    -    -    -     I   -    -    -    0.0  -    -    I  6.9
RAV      84.9  -   86.9 86.5 -0.4   I  92.2  -   98.1 97.9 96.9 -1.0  I 47.1
SCN      86.6 99.8 99.7 100   0.3   I  97.7 100  99.8 100  100   0.0  I 95.8
SWP      98.4  -   99.0 99.6  0.6   I  98.6  -   98.5 98.6  -    -    I  -
TBA      92.6  *    *    *    -     I  98.7  *    *    *    -    -    I  -
TSC       -   55.3 53.8  -    -     I   -   76.5 64.9  -    -    -    I  -
VBS       -    -    -    -    -     I  41.5  -    -    -    -    -    I  -
VBW       -   26.5  -    -    -     I  93.4  -    -    -    -    -    I  -
VET       -   66.3  *    *    *     I   -   97.6  *    *    -    -    I  -
VSP       -   86.4 79.7 78.1 -1.6   I   -    0.4  0.3  -    -    -    I  -
-----------------------------------+--------------------------------+-------
Mean     95.0 84.2 89.7 91.6  0.3   I  92.1 90.4 93.5 95.0 95.6 +1.3  I 61.0
-----------------------------------+-----------------------------------------

Generally, the mean ability of W-98 scanners to detect zoo macro
viruses
has further improved (from 95.0% to 95.6%). Moreover, 4 scanners are
"perfect" (100% detection rate) as they detect ALL macro viruses in
the zoo testbed, and 4 more are "almost perfect" (missing a few
viruses, with detection rates rounding to 100%).

Requiring 100% detection of ALL ITW viruses in ALL instantiations
(files), the following grid is applied to grade macro virus
detection under W-98:
 1) detection rate of ITW macro viruses: 100% (perfect) mandatory
 2) detection rate of ITW macro objects: 100% (perfect) mandatory
 3) detection rate of zoo macro viruses: 100% (perfect)
      or 100~ (almost perfect) or >99% (excellent)
 4) detection rate of zoo macro objects: 100% (perfect)
      or 100~ (almost perfect) or >99% (excellent)

Grading of best W-98 products:
------------------------------
4 products are "perfect" as they detect ALL zoo and ITW macro
viruses in all instantiations:              CMD, DSE, FPW and FSE.
4 products are "almost perfect" as they detect ALL ITW macro viruses
and all BUT ONE zoo macro virus/sample:     AVK, AVP, PAV and SCN.
3 products are "excellent" as they detect ALL ITW macro viruses and
all BUT A FEW zoo macro viruses/samples:    NVC, INO and AVX.

HOWEVER: Detection rates for script viruses (even on a small
collection of VBS, mIRC and JavaScript viruses) are FAR FROM
ACCEPTABLE, with a mean detection rate (of 20 products) just at 61%.

Grading of script virus detection under W-98:
---------------------------------------------
3 (out of 20) products reach 95% detection (= "very good"):
     FSE (96.7), DSE (95.8), SCN (95.8)
AND 4 products reach 90% detection (= "good"):
     CMD (93.5), AVK (91.2), FPW (90.8), PAV (90.2)

*********************************************************************
Results #3: Macro virus detection rates under W-98 on a high level
            but script virus detection rates insufficient.
*********************************************************************
#3.1: Detection rates for macro viruses for scanners under Windows
      98 are rather stable at a fairly high, though not perfect,
      level:
         Perfect scanners (100%): 4 (last test: 3)
            CMD, DSE, FPW and FSE.
         Almost perfect scanners: 4
            AVK, AVP, PAV and SCN.
         Excellent scanners (>99%): 3 (last test: 7)
            NVC, INO and AVX.
#3.2: HOWEVER: similar to DOS (see #1.3), detection rates for
      script viruses under W-98 are FAR FROM acceptable:
         NO product is "perfect" (100%) or "excellent" (>99%)
         3 products are "very good" (>95%): FSE, DSE, SCN
         4 products are "good" (>90%): CMD, AVK, FPW, PAV
#3.3: AV producers must invest significantly more work and quality
      into the detection of script viruses, as this threat is
      growing significantly for W-98 users!
**********************************************************************


6. Summary #4: Macro virus detection quality of W-NT AV products:
=================================================================

The number of scanners running under Windows NT is growing.
Significantly more products were available for these tests, whereas
traditional DOS products seem to be somewhat neglected (at least
concerning submission of products and results).
The following table summarizes the results of file and macro virus
detection under Windows-NT in the last 7 VTC tests:

Table ES4: Comparison: Macro Virus Detection Rates in the last 7
           VTC tests under Windows NT:
============================================================
Detection of:                                                                             Script
Scan    ------- File Virus ------------   +  ------------- Macro Virus ---------------   + Virus
ner     9707 9802 9810 9903 9909 0004 Delta I 9707 9802 9810 9903 9909 0004 0008 Delta I  0008
-----------------------------------------+------------------------------------------+------
ANT     88.9 69.2 91.3  -   87.2 92.8 12.6 I 92.2  -   85.7  -   89.3 90.2 96.4 +6.2 I 55.2
ANY      -    -   69.7  -    -    -    -   I  -    -   70.5  -    -    -    -    -   I  -
ATD      -    -    -    -    -   100%  -   I  -    -    -    -    -   99.9  -    -   I  -
AVA      -   97.4 96.6 97.1 97.4 97.2 -0.2 I  -   91.9 97.2 95.2 93.3 94.3 94.1 -0.2 I 15.0
AVG      -    -    -   87.3 87.0 85.4 -1.6 I  -    -    -   82.5 96.6 97.5 97.9 +0.4 I 45.8
AVK      -    -   99.6 90.2 99.8 99.7 -0.1 I  -    -   99.6 99.6 100% 99.9 100  +0.1 I 91.8
AVP      -    -   83.7 99.9 99.8 99.9  0.1 I  -    -   100% 99.2 100% 99.9 100  +0.1 I 88.2
AVX      -    -    -   74.2 75.2 80.4  5.2 I  -    -    -   98.9 98.7 94.5 99.0 +4.5 I 61.4
AW       -   56.4  -    -    -    -    -   I  -   61.0  -    -    -    -    -    -   I  -
CLE      -    -    -    -    -    -    -   I  -    -    -    -    -    -    -    -   I  4.2
CMD      -    -    -    -    -   99.6  -   I  -    -    -    -    -   100% 100%  0.0 I 93.5
DRW/DWW  -    -   93.3 98.3 98.3  0.0  -   I  -    -    -   98.3 98.8 98.4 97.5 -0.9 I 59.8
DSS/E   99.6 99.7 99.9 99.3  *    -    -   I 99.0 100% 100% 100%  *    -    -    -   I  -
ESA      -    -    -    -    -   58.0  -   I  -    -    -    -    -   88.9  -    -   I  -
FPR/FMA  -   96.1  -   98.7 99.4  -    -   I  -   99.9 99.8 99.8 99.7  -    -    -   I  -
FPW      -    -    -    -    -   99.6  -   I  -    -    -    -   99.7 100% 100%  0.0 I 90.8
FSE      -   85.3 99.8 100% 99.9 100%  0.1 I  -    -   99.9 100% 100% 100% 100%  0.0 I 96.7
FWN      -    -    -    -    -    -    -   I  -    -   99.6 99.7  -   99.9  -    -   I  -
HMV      -    -    -    -    -    -    -   I  -    -   99.0 99.5  -    -    -    -   I  -
IBM     95.2 95.2 77.2  *    *    *    *   I 92.9 92.6 98.6  *    *    *    *    *   I  *
INO      -   92.8  -   98.1 98.0 98.7  0.7 I  -   89.7  -   99.8 99.7 99.7 99.8 +0.1 I 78.1
IRS      -   96.3  -   97.6  -    -    -   I  -   99.1  -   99.5  -    -    -    -   I  -
IVB      -    -    -    -    -    -    -   I  -    -   92.8 95.0  -    -    -    -   I  -
MKS      -    -    -    -    -   78.0  -   I  -    -    -    -    -   97.1  -    -   I  -
MR2      -    -    -    -   61.9  -    -   I  -    -    -    -   69.6  -    -    -   I  -
NAV     86.5 97.1  -   98.0 97.6 96.8 -0.8 I 95.6 98.7 99.9 99.7 98.7 98.0 97.7 -0.3 I 36.6
NOD      -    -    -   97.6 98.2 98.3  0.1 I  -    -    -   99.8 100% 99.4  -    -   I  -
NVC     89.6 93.8 93.6 96.4  -   99.1  -   I 96.6 99.2  -   98.9 98.9 99.9 99.9  0.0 I 83.7
NVN      -    -    -    -   99.0  -    -   I  -    -    -    -   99.5  -    -    -   I  -
PAV     97.7 98.7 98.4 97.2 99.6 100%  0.4 I 93.5 98.8 99.5 99.4 99.7 99.9 100  +0.1 I 90.2
PCC     63.1  -    -    -    -    -    -   I  -   94.8  -    -    -    -    -    -   I  -
PER      -    -    -    -    -    -    -   I  -    -    -    -    -    -   85.0  -   I  0.0
PRO      -    -    -   37.3 42.4 45.6  3.2 I  -    -    -   58.0 61.9 67.4 69.1 +1.7 I 13.1
QHL      -    -    -    -    -    -    -   I  -    -    -    -    -    0.0  -    -   I  6.9
RAV      -   81.6 84.9 85.5  -   88.0  -   I  -   98.9 99.5 99.2  -   97.9 96.9 -1.0 I 47.1
RA7      -    -    -   89.3  -    -    -   I  -    -    -   99.2  -    -    -    -   I  -
SCN     94.2 91.6 71.4 99.1 99.8 99.8  0.0 I 97.6 99.1 97.7 100% 100% 100% 100%  0.0 I 95.8
SWP     94.5 96.8 98.4  -   99.0 99.6  0.6 I 89.1 98.4 97.5  -   98.4 98.6  -    -   I  -
TBA      -   93.8 92.6  *    *    *    -   I 96.1  -   98.7  *    *    *    -    -   I  -
TNT      -    -    -    *    *    *    -   I  -    -   44.4  *    *    *    -    -   I  -
VET     64.9  -    -   65.4  *    *    -   I  -   94.0  -   94.9  *    *    -    -   I  -
VSA      -   56.7  -    -    -    -    -   I  -   84.4  -    -    -    -    -    -   I  -
VSP      -    -    -   87.0 69.8 78.1  8.3 I  -    -    -   86.7  0.3  0.0  -    -   I  -
-----------------------------------------+------------------------------------------+-------
Mean:   87.4 88.1 89.0 89.2 90.0 91.0 1.8% I 94.7 95.9 91.6 95.3 95.1 96.5 96.3 +0.6 I 57.7
-----------------------------------------+------------------------------------------+-------

Generally, the "mean" detection rate of macro viruses is stable at
an acceptable though not overly good level (96.3%); again, products
that participated in previous VTC tests
succeed in following the growth of the threat by keeping detection
rates (+0.6%) at high levels, from where spectacular improvements
are less easy.

With the same grid as for W-98 products, the following grades are
assigned to W-NT related AV products concerning macro virus
detection:

Grading of best W-NT products:
------------------------------
4 products are "perfect" as they detect ALL zoo and ITW macro
viruses in all instantiations:              CMD, FPW, FSE and SCN.
3 products are "almost perfect" as they detect ALL ITW macro viruses
and all BUT ONE zoo macro virus/sample:     AVK, AVP and PAV.
3 products are "excellent" as they detect ALL ITW macro viruses and
all BUT A FEW zoo macro viruses/samples:    NVC, INO and AVX.

HOWEVER: regarding the detection of script viruses, the situation is
worse than that reported under DOS (mean: 66.4%) and even worse than
under W-98 (mean: 61%), as only 57.7% are detected "in the mean".

Grading of script virus detection under W-NT:
---------------------------------------------
NO product is "perfect" (100%) or "excellent" (>99%)
2 products are "very good": FSE (96.7%), SCN (95.8%)
4 products are "good": CMD (93.5%), AVK (91.8%), FPW (90.8%),
                       PAV (90.2%)

********************************************************************
Results #4: Macro virus detection rates under W-NT on a high level
            ----------------------------------------------------
#4.1) Detection rates for macro viruses for scanners under Windows
      NT are rather stable at a fairly high level, with 4 products
      on a "perfect" level:
         Perfect scanners (100%): 4 (last test: 1)
            CMD, FPW, FSE, SCN.
         Almost perfect scanners: 3
            AVK, AVP, PAV.
         Excellent scanners (>99%): 3 (last test: 8)
            NVC, INO, AVX.
#4.2) HOWEVER: similar to DOS (see #1.3) and W-98 (see #3.2),
      detection rates for script viruses under W-NT are FAR FROM
      acceptable:
         NO product is "perfect" (100%) or "excellent" (>99%)
         2 products are "very good": FSE, SCN
         4 products are "good": CMD, AVK, FPW, PAV
#4.3) AV producers must invest significantly more work and quality
      into the detection of script viruses, as this threat is
      growing significantly for W-NT users!
********************************************************************


7. Summary #5: Macro virus detection quality of W-2k AV products:
=================================================================

Despite the recent advent of Windows-2000, which is aimed at
business applications, there is already a large number of AV
products for this platform. Among the platforms used for testing,
this was the only one where no crash or other major problem was
reported; this may indicate that this platform is more stable than
any previous Windows platform from Microsoft.

The following table summarizes the results of macro and script virus
detection under Windows-2000 in this test.
Table ES5: Macro/Script Virus Detection Rates under W-2k:
=========================================================
(No file virus tests were performed in test "2000-08"; see
Table ES0.)

Scan      - Macro Virus -   +   - Script Virus -
ner             0008        I         0008
---------------------+-----------------+-----------------
ANT    I       93.3         I        53.9
AVA    I       94.1         I        15.0
AVG    I       97.9         I        45.8
AVK    I      100.0         I        91.5
AVP    I      100.0         I        88.2
AVX    I       99.0         I        61.4
CLE    I        -           I         4.2
CMD    I      100.0%        I        93.5
DRW    I       97.5         I        59.8
FPW    I      100.0%        I        90.8
FSE    I      100.0%        I        96.7
INO    I       99.8         I        78.1
NAV    I       97.7         I        36.6
NVC    I       99.9         I        83.7
PAV    I      100.0         I        90.2
PER    I       85.0         I         0.0
PRO    I       69.1         I        12.1
QHL    I        0.0         I         6.9
RAV    I       96.9         I        47.1
SCN    I      100.0%        I        95.8
---------------------+-----------------+-----------------
Mean:  I       91.1%        I        57.6%
---------------------+-----------------+-----------------

With the same grid used for the evaluation of AV products under W-98
and W-NT, the following grades are assigned:

Grading of best W-2k products:
------------------------------
4 products are "perfect" as they detect ALL zoo and ITW macro
viruses in all instantiations:              CMD, FPW, FSE and SCN.
3 products are "almost perfect" as they detect ALL ITW macro viruses
and all BUT ONE zoo macro virus/sample:     AVK, AVP and PAV.
3 products are "excellent" as they detect ALL ITW macro viruses and
all BUT A FEW zoo macro viruses/samples:    NVC, INO and AVX.

In comparison with macro virus detection (mean value: 91.1%), the
detection of script viruses is insufficient (mean value: 57.6%). But
it is interesting to observe that the detection rates of several
products are more than 1.0% higher than those of the corresponding
W-98 and W-NT products:

Grading of script virus detection under W-2k:
---------------------------------------------
NO product is "perfect" (100%) or "excellent" (>99%)
3 products are "very good": FSE (97.2), SCN (96.6), CMD (95.3)
4 products are "good": AVK (93.9), AVP (92.4), FPW (92.4),
                       PAV (92.4)

*********************************************************************
Results #5: Macro virus detection rates under W-2k on a high level
            ----------------------------------------------------
#5.1: Detection rates for macro viruses for scanners under Windows
      2000 are starting at a high level, with 4 products on a
      "perfect" level:
         Perfect scanners (100%): CMD, FPW, FSE, SCN.
         Almost perfect scanners: AVK, AVP, PAV.
         Excellent scanners (>99%): NVC, INO, AVX.
#5.2: HOWEVER: similar to all other platforms, detection rates for
      script viruses under W-2k are FAR FROM acceptable:
         NO product is "perfect" (100%) or "excellent" (>99%)
         3 products are "very good": FSE, SCN, CMD
         4 products are "good": AVK, AVP, FPW, PAV
#5.3: AV producers must invest significantly more work and quality
      into the detection of script viruses, as this threat is
      growing significantly for W-2k users!
**********************************************************************


8. Summary #6: Detection of packed viral objects (DOS/W-32
   platforms):
======================================================================

As many macro objects are transferred over insecure networks in
compressed form, it is essential that AV products are able to detect
macro viruses also in packed objects. VTC tests the related
detection quality of AV products by packing ITW macro viruses with 4
broadly used packers: PKZIP, LHA, ARJ and RAR.
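For illustration, such a packed testbed can be generated by
archiving each ITW sample with each of the 4 packers. The following
Python sketch is an assumption about how this might be done, not
VTC's actual procedure (paths and the external packer command lines
are illustrative; only the ZIP case uses Python's standard library):

    import pathlib, subprocess, zipfile

    def pack_sample(sample: pathlib.Path, outdir: pathlib.Path):
        # Create four packed variants of one ITW macro virus sample.
        outdir.mkdir(parents=True, exist_ok=True)
        # PKZIP-compatible archive via the standard library:
        with zipfile.ZipFile(outdir / (sample.name + ".zip"), "w",
                             compression=zipfile.ZIP_DEFLATED) as zf:
            zf.write(sample, arcname=sample.name)
        # ARJ, LHA and RAR need external packers (illustrative calls):
        for tool, ext in (("arj", ".arj"), ("lha", ".lzh"),
                          ("rar", ".rar")):
            subprocess.run([tool, "a",
                            str(outdir / (sample.name + ext)),
                            str(sample)], check=True)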
The following grid is applied:
   A "perfect" scanner detects all ITW viruses
     in all 4 packed forms with 100%.
   An "excellent" scanner detects all ITW viruses
     in at least 3 packed forms with 100%.
   A "very good" scanner detects all ITW viruses
     in at least 2 packed forms with 100%.

In comparison with the last VTC test (1999-09), the situation has
slightly improved but needs further work:

----------------------------------------------------------------------
"Perfect" packed DOS macro virus detectors:    AVK, CMD, SCN
"Perfect" packed W-32 macro virus detectors:   AVK, AVP, AVX, CMD,
                                               PAV, SCN
----------------------------------------------------------------------
"Excellent" packed DOS macro virus detectors:  ---
"Excellent" packed W-32 macro virus detectors: AVG, NAV
----------------------------------------------------------------------
"Very good" packed DOS macro virus detectors:  FPR, INO
"Very good" packed W-32 macro virus detectors: FPW, INO, RAV, NVC
----------------------------------------------------------------------

********************************************************************
Results #6: Detection rates of packed macro viral objects improving
            on all platforms!
            ----------------------------------------------------
#6.1) Perfect scanners detect packed macro viruses both under DOS
      and all W-32 platforms for ALL 4 methods: AVK, CMD and SCN
#6.2) Some perfect scanners detect packed macro viruses under all
      W-32 platforms for all 4 methods: AVP, AVX and PAV
#6.3) But there is still need for improvement.
*******************************************************************


9. Summary #7: False Positive avoidance (DOS/W-32 platforms):
=============================================================

Regarding the ability of scanners to avoid FP alarms under DOS, only
4 (of 10) products gave NO FP alarm on the testbed; compared to the
last test, where 7 scanners were "perfect", this is a less
favourable situation. On the other hand, the situation has
significantly improved for W-32 products (W-98, W-NT, W-2k), where 8
(last test for W-NT: 7) products now give NO false alarm on clean
macro objects.

FP-avoiding "perfect" DOS scanners:       ANT(=), AVK(+), INO(+),
                                          SCN(=)
FP-avoiding "perfect" W-98 scanners:      AVA, AVG, AVK, AVP, DSE,
                                          INO, PAV, PRO, SCN
FP-avoiding "perfect" W-NT/W-2k scanners: AVA, AVG, AVK, AVP, INO,
                                          PAV, PRO, SCN

The following products avoid any false alarm under ANY platform:
     AVK, INO, SCN

(Remark: a direct comparison of 16-bit scan engines for DOS and
32-bit scan engines for W-NT is not possible. The argument
concerning an "overall perfect product" applies more to the suite of
software than to single products. Indeed, FPW and FPR are different
engines in the Frisk Software suite, as the SCN engines are in NAI's
suite.)

*****************************************************************
Results #7: Avoidance of False-Positive alarms is improving:
            ----------------------------------------------------
#7.1) FP-avoiding "perfect" products, ALL platforms (3):
      AVK, INO, SCN
#7.2) FP-avoiding perfect DOS scanners (4):
      ANT, AVK, INO, SCN
#7.3) FP-avoiding perfect W-32 scanners (8):
      AVA, AVG, AVK, AVP, INO, PAV, PRO, SCN
#7.4) There is still need for improvement in FP avoidance.
*****************************************************************


10. Summary #8: Detection of Macro Malware (DOS/W-32 platforms):
================================================================

Since test "1997-07", VTC also tests the ability of AV products to
detect non-viral malware.
An essential argument for this category is that customers are
interested in being warned about and protected against not only
viruses but also other malicious objects such as Trojans etc., the
payload of which may be disastrous to their work (e.g. stealing
passwords). Regrettably, the awareness among AV producers of the
need to protect their users against related threats is still
underdeveloped. Manifold arguments are presented why AV products are
not the best protection against non-viral malware; from a technical
point of view, these arguments may seem conclusive, but at the same
time, almost nothing is done to support customers with adequate
AntiMalware software. On the other hand, AV methods (such as
scanning for the presence or absence of characteristic features) are
also applicable - though not ideal - for detecting non-viral
malware. Since VTC test "1999-03", malware detection is a mandatory
part of VTC tests, both for submitted products and for those
downloaded as free evaluation copies.

Looking only at the macro malware results (which have been better
than the related file malware results in all previous tests), TWO AV
products succeed in properly detecting ALL non-viral malware samples
under ALL platforms (DOS and W-32 platforms); accordingly, these
products can fairly be regarded as AntiMalware products:

           ----- Macro Malware Detection ------
           == DOS == = W-NT = = W-98 = = W-2k =
           ------------------------------------
  FPR/FPW:   100%      100%     100%     100%
  CMD:       100%      100%     100%     100%
  FSE:       ----      99.7     100%     100%
  SCN:       98.8      97.0     100%     100%
           ------------------------------------

2 products detect macro malware under ALL platforms "perfectly":
     CMD, FPR/FPW
2 more products detect macro malware under W-NT and W-2k:
     FSE, SCN

****************************************************************
Results #8: Macro malware detection under DOS/W-32 improving:
            ------------------------------------------------
#8.1) "Perfect" AntiMalware products for DOS/W-32:
      CMD, FPR/FPW
#8.2) "Excellent" AntiMalware products for DOS/W-32:
      FSE, SCN
****************************************************************


11. Summary #9: File/Macro Virus detection under 32-bit engines:
================================================================

Concerning the 32-bit engines used under Windows-98, Windows-NT and
Windows-2000, it is interesting to test the validity of the
hypothesis that related engines produce the same detection and
identification quality. (For details see 6HCOMP32.TXT.)

When comparing results from the related tests, the good news is that
32-bit engines increasingly behave equally well on all 32-bit
platforms:

   Equal detection of zoo file viruses:   14 (of 19) products
                  of ITW file viruses:    17 (of 19) products
                  of zoo script viruses:  15 (of 20) products

******************************************************************
Results #9: Many W-32 scanners perform equally on W-32 platforms
******************************************************************
#9.1: The assumption that 32-bit engines in scanners produce the
      same detection rate for different instantiations of 32-bit
      operating systems (esp. for W-98, W-NT and W-2000) is now
      correct for the majority of scanners.
#9.2: Analysis of ITW detection rates is NOT sufficient to determine
      the behaviour of 32-bit engines and does not guarantee equal
      detection rates for different W-32 platforms (esp.
      W-98/W-NT).
*****************************************************************
12. Final remark: Searching for the "Perfect AV/AM product"
===========================================================

Under the scope of VTC's grading system, a "Perfect AV/AM product"
would have the following characteristics:

Definition: A "Perfect AntiVirus (AV) product"
----------------------------------------------
 1) Will detect ALL viral samples "In-The-Wild" and at least 99% of
    zoo samples, in ALL categories (file, boot and script-based
    viruses), always with the same high precision of identification
    and in EVERY infected sample,
 2) Will detect ALL ITW viral samples and at least 99% of zoo
    samples in compressed objects, packed with all (4) popular
    packers, and
 3) Will NEVER issue a False Positive alarm on any sample which is
    not viral.

Definition: A "Perfect AntiMalware (AM) product"
------------------------------------------------
 1) Will be a "Perfect AntiVirus product",
    that is: 100% ITW detection AND
             >99% zoo detection AND
             high precision of identification AND
             high precision of detection AND
             100% detection of ITW viruses in compressed objects AND
             0% False-Positive rate,
 2) AND it will also detect essential forms of malicious software,
    at least in unpacked forms, reliably at high rates (>90%).

*************************************************************
In VTC test "2000-08", we found *** NO perfect AV product ***
              and we found      *** NO perfect AM product ***
*************************************************************

But some products seem to approach our definitions on a rather high
level, taking into account the highest grade "perfect" defined on
the 100% level, "almost perfect" on the (rounded) ~100% level, and
"excellent" defined by 99% for virus detection and 90% for malware
detection (with "very good" as the best grade for script virus
detection):

Test category:       "Perfect"        "Almost perfect" "Excellent"     "Very Good"
---------------------------------------------------------------------------
DOS macro test:      ---              CMD,SCN          AVK,FPR,NVC,INO ---
DOS ITW tests:       AVK,CMD,DRW,     ---              ---             ---
                     FPR,INO,NVC,SCN
DOS script test:     ---              ---              ---             ---
W-98 macro tests:    CMD,DSE,FPW,FSE  AVK,AVP,PAV,SCN  NVC,INO,AVX     ---
W-98 script test:    ---              ---              ---             FSE,DSE,SCN
W-NT zoo/ITW test:   CMD,FPW,FSE,SCN  AVK,AVP,PAV      NVC,INO,AVX     ---
W-NT script test:    ---              ---              ---             FSE,SCN
W-2k macro test:     CMD,FPW,FSE,SCN  AVK,AVP,PAV      NVC,INO,AVX     ---
W-2k script test:    ---              ---              ---             FSE,SCN,CMD
DOS packVirus test:  AVK,CMD,SCN      ---              ---             FPR,INO
W-32 packVirus test: AVK,AVP,AVX,     ---              AVG,NAV         FPW,INO,RAV,NVC
                     CMD,PAV,SCN
DOS FP avoidance:    ANT,AVK,INO,SCN  ---              ---             ---
W-32 FP avoidance:   AVA,AVG,AVK,AVP, ---              ---             ---
                     PAV,PRO,SCN
----------------------------------------------------------------------------
DOS malware test:    FPR/FPW,CMD      ---              ---             ---
W-32 malware test:   FPR/FPW,CMD      ---              FSE,SCN         ---
----------------------------------------------------------------------------

In order to support the race for more customer protection, we
evaluate the order of performance in this test with a simple
algorithm, by counting the majority of places, with the following
weights: "perfect" 2.0 points, "almost perfect" 1.5 points,
"excellent" 1.0 points, and "very good" 0.5 points.
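This weighting scheme can be written down directly; the Python
sketch below is illustrative only (names invented), not the actual
evaluation script:

    # Points per grade, as defined above:
    WEIGHTS = {"perfect": 2.0, "almost perfect": 1.5,
               "excellent": 1.0, "very good": 0.5}

    def total_points(grades):
        # Sum the weights over all categories a product was graded in.
        return sum(WEIGHTS[g] for g in grades)

    # e.g. SCN's row in the category table above: 7 "perfect",
    # 2 "almost perfect" and 3 "very good" placings give
    # 7*2.0 + 2*1.5 + 3*0.5 = 18.5 points.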
This results in the following grades (down to 10th place / 4.0
points), also indicating changes in places versus the last test
"1999-09" (although the evaluation method has been changed
significantly):

************************************************************
"Perfect" AntiVirus product:  NONE
************************************************************
"Excellent" AV products:                     (last test)
     1st place: SCN       18.5 points            (4)
     2nd place: AVK       15.5 points            (-)
     3rd place: CMD       14.0 points            (-)
     4th place: FPR/FPW   10.0 points            (2)
     5th place: INO        9.0 points            (-)
     6th place: AVP        8.5 points            (3)
     7th place: FSE        7.5 points            (1)
     8th place: NVC        6.5 points            (-)
     9th place: AVX        5.0 points            (-)
    10th place: DRW        4.0 points
************************************************************
"Perfect" AntiMalware product:  NONE
*************************************************************
"Excellent" AntiMalware products:            (last test)
     1st place: SCN       19.5 points            (1)
     2nd place: CMD       14.0 points            (-)
     2nd place: FPR/FPW   14.0 points            (-)
     4th place: FSE        8.5 points            (-)
************************************************************

Generally, we hope that these rather detailed results help AV
producers to adapt their products to growing threats and thus to
protect their customers.


13. Availability of pre-released test results:
==============================================

Much more information about this test, its methods and viral
databases, as well as detailed test results, is available for
anonymous FTP download from VTC's homepage (VTC is part of Working
Group AGN):
     ftp://agn-www.informatik.uni-hamburg.de/vtc

Any comments and critical remarks which help VTC improve its test
methods are warmly welcomed.

A new test concerning the ability of AV products to reliably CLEAN
virally infected macro objects (ART = Antivirus Repair Test) will be
published in early October 2000.

The next comparative test is planned for December 2000 until January
2001, with viral databases frozen on October 31, 2000. Any AV
producer wishing to participate in the forthcoming test is invited
to submit the related products.

On behalf of the VTC Test Crew:
Dr. Klaus Brunnstein (September 24, 2000)


14. Copyright, License, and Disclaimer:
=======================================

This publication is (C) Copyright 2000 by Klaus Brunnstein and the
Virus Test Center (VTC) at the University of Hamburg, Germany.
Permission (Copy-Left) is granted to everybody to distribute copies
of this information in electronic form, provided that this is done
for free, that the contents of the information are not changed in
any way, and that the origin of this information is explicitly
mentioned.

It is esp. permitted to store and distribute this set of text files
at non-commercial sites (including universities, though not for any
commercial endeavour) or other public mirror sites where
security/safety related information is stored for unrestricted
public access AND for free. Any other use, esp. including the
distribution of these text files on CD-ROMs or any publication as a
whole or in parts, is ONLY permitted after contact with the
supervisor, Prof. Dr. Klaus Brunnstein, or authorized members of the
Virus Test Center at Hamburg University, and this agreement must be
made explicitly in writing prior to any publication.

No responsibility is assumed by the author(s) for any injury and/or
damage to persons or property as a matter of product liability,
negligence or otherwise, or from any use or operation of any
methods, products, instructions or ideas contained in the material
herein.
Prof. Dr. Klaus Brunnstein
Faculty for Informatics
University of Hamburg, Germany
(September 24, 2000)