============================================================
File 0XECSUM.TXT: "EXECUTIVE SUMMARY"
AntiVirus/AntiMalware Product Test "2003-04"
antiVirus Test Center (VTC), University of Hamburg
============================================================
[Formatted with non-proportional font (Courier), 72 columns]

Remark: In this test (2003-04), 10 selected products were tested
under the same conditions as in test "2002-12", though under
Windows XP. Consequently, the results can be compared.

**********************************************************************
Content of this file:
**********************************************************************
 0.    Editor's Foreword
 1.    Background of this test; Malware Threats
         Table ES0: Development of viral/malware threats
 2.    VTC Testbeds used in VTC tests "2002-12" and "2003-04"
         Table ES1: Content of VTC test databases in tests
                    "2002-12" and "2003-04"
 3.    Products participating in test "2003-04"
         Table ES2: List of AV products in test "2003-04"
 3A.   aVTC's Evaluation and Grading Scheme
 4.    Problems in test "2003-04"
 5-XP. Results of on-demand detection under Windows-XP:
         Table WXP-A: Results of WXP scanners for file, macro
                      and script viruses/malware
         Findings WXP.1 - WXP.7
         Grading of WXP products according to their detection
         performance
 9.    Comparison of detection behaviour for W32 platforms
         Grading of AV products concerning W32-harmonical behaviour
 11.   Final remark: In search of the "Perfect AV/AM product"
 14.   Availability of full test results
 15.   Copyright, License, and Disclaimer
***********************************************************************

0. Editor's Foreword:
=====================

VTC test "2003-04" is aVTC's first test of AV products under
Windows-XP. For comparability, this test was performed with the same
testbeds used in test "2002-12" for AV products under W-98 and W-2000
as well as DOS and Linux. For details of the testbeds, see aVTC test
"2002-12".

aVTC test "2003-04" was performed on a limited number of clients
capable of running WXP. Fortunately, the next WXP-related tests will
be easier, as the Faculty for Informatics at Hamburg University has
supported the test-lab with more clients capable of running WXP.

This test - as all previous ones - was performed by students of the
Faculty for Informatics at Hamburg University with a special interest
in IT Security (see our 4-semester curriculum, started in 1988, on
our homepage). Different from other tests, where submitters of
products have to pay a fee to be admitted, VTC tests are "FREE OF
FEE". This implies that students, who have to complete their
examinations and usually also work to earn their income, are only
"partially available" for tests. Moreover, our hardware, which is
essentially funded by Faculty support (sometimes also by donations of
new machines, usually more powerful than those which we can buy from
university money), canNOT compete with the technical equipment of
other test-labs. We regret that these circumstances cause delays in
performing and publishing our regular test reports, but instead of
hurrying to meet dates and expectations, we insist that the assessed
QUALITY of our test results shall have - also in the future - highest
priority.

Most of the work in VTC tests rests on the shoulders of our test
crew, and the editor wishes to thank them all for their devotion and
hard work (see the VTC test team at the end of this report).
1. Background of this test: Malware Threats:
============================================

Malicious software (malware), including viruses (= self-replicating
malware), trojan horses (= pure payload without self-replication),
virus droppers and network malware (e.g. worms and hostile applets),
is regarded as a serious threat to PC users, esp. when connected to
intranets and the Internet.

The development of malicious software can well be studied in view of
the growth of the VTC (zoo and In-The-Wild) testbeds. The following
table summarises, for previous and current VTC tests (indicated by
their year and month of publication), the size of the virus and
malware (full = "zoo") databases, indicating in each case the number
of different viruses or malware specimens and the number of
instantiations (infected objects), and having in mind that some
revisions of testbeds were made:

Table ES0: Development of threats as present in VTC test databases:
====================================================================
----------------------------------+----------------+--------------------------+-------------------------
         == File Viruses/Malware ==I = Boot Viruses=I = Macro Viruses/Malware ==I= Script Viruses/Malware=
Test#     Number Infected   Number I Number Infected I Number Infected   Number I Number Infected  Number
          viruses objects  malware I viruses objects I viruses objects  malware I viruses objects malware
----------------------------------+----------------+--------------------------+-------------------------
1997-07:   12,826  83,910      213 I    938   3,387  I    617    2,036      72  I   ---     ---      ---
1998-03:   14,596 106,470      323 I  1,071   4,464  I  1,548    4,436     459  I   ---     ---      ---
1998-10:   13,993 112,038    3,300 I    881   4,804  I  2,159    9,033     191  I   ---     ---      ---
1999-03:   17,148 128,534    3,853 I  1,197   4,746  I  2,875    7,765     200  I   ---     ---      ---
 VKIT/Poly:    +5 146,640          I                 I                          I
1999-09:   17,561 132,576    6,217 I  1,237   5,286  I  3,546    9,731     329  I   ---     ---      ---
 VKIT/Poly:    +7 166,640          I                 I                          I
2000-04:   18,359 135,907    6,639 I  1,237   5,379  I  4,525   12,918     260  I   ---     ---      ---
 VKIT/Poly:    +7 166,640          I                 I                          I
2000-08:      ---     ---      --- I    ---     ---  I  5,418   15,720     500  I   306     527      ---
2001-04:   20,564 140,703   12,160 I  1,311   5,723  I  6,233   19,387     627  I   477     904      ---
 VKIT/Poly:    +7 166,640          I                 I                          I
2001-07H1:    ---     ---      --- I    ---     ---  I   +544   +2,035    +102  I  +206    +386      ---
2001-10:      ---     ---      --- I    ---     ---  I  6,762   21,677     683  I   481   1,079       30
2002-02H2:    ---     ---      --- I    ---     ---  I   +675   +3,245    +720  I  +854  +1,645     +270
----------------------------------+----------------+--------------------------+-------------------------
TESTBED OF THIS TEST:
2002-12:   21,790 158,747   18,277 I ITW:11     149  I  7,306   25,231     747  I   823   1,574      202
2003-04:   21,790 158,747   18,277 I    ---     ---  I  7,306   25,231     747  I   823   1,574      202
----------------------------------+----------------+--------------------------+-------------------------

Remark #1: Before test 1998-10, an ad-hoc cleaning operation was
applied to remove samples whose virality could not be proved easily.
Since test 1999-03, separate tests are performed to evaluate the
detection rates for VKIT-generated and selected polymorphic file
viruses.

Remark #2: Heureka tests (marked "H#") use only those samples
(viruses and malware) which were newly found (marked "+") in a given
period (for details, see the related test reports); Heureka tests
include macro and script viruses/malware.

For details of the testbeds, see the report of aVTC test "2002-12".
2. VTC Testbeds used in VTC test "2003-04":
===========================================

The current sizes of the different VTC testbeds (developed from
previous testbeds through inclusion of new viruses and malware and
some revision) are given in the following table (for detailed indices
of the VTC testbeds, see file "a3testbed.zip"):

Table ES1: Content of VTC test databases in tests "2002-12"/"2003-04":
======================================================================
 "Full Zoo":  21,790 File Viruses in 158,747 infected files
               8,001 different File Malware in 18,277 files
                 664 Clean file objects for False Positive test
               7,306 Macro Viruses in 25,231 infected documents
                 450 different Macro Malware in 747 macro objects
                 329 Clean macro objects for False Positive test
                 823 different Script Viruses in 1,574 infected objects
                 117 different Script Malware in 202 script objects
 ---------------------------------------------------------------------
 "ITW Zoo":       11 Boot Viruses in 149 infected images/sectors
                  50 File Viruses in 443 infected files
                 124 Macro Viruses in 1,337 infected documents
                  20 Script Viruses in 122 infected objects
======================================================================

For a survey of platforms, see A4tstdir.txt; for the content of the
respective testbeds, see A3TSTBED.zip (available for download).

Concerning the quality of viral testbeds, it is sometimes difficult
to assess the "virality" (= the ability of a given sample to
replicate at least twice under given constraints) of large "viral
zoo" databases, esp. as some viruses work only under very specific
conditions. We are glad to report that Dr. Vesselin Bontchev, Eugene
Kaspersky, Dr. Igor Muttik and other experts, esp. from the informal
("non-")organisation of cooperating AV experts (Computer Antivirus
Research Organisation, CARO), helped us significantly with critical
and constructive comments to establish viral testbeds whose residual
non-viral part should be very small.

Post-analysis and Quality Measure of VTC testbeds:
--------------------------------------------------
Those samples which some product did not properly detect are usually
sent to a trustworthy expert of the respective company for
post-analysis. In almost all cases, newer versions of that product
were able to detect the previously missed samples. We were esp. happy
to receive comments from Vesselin Bontchev, Eugene Kaspersky and Igor
Muttik about some specimens which should not have been included in
one testbed (though possibly belonging to another testbed). Indeed,
some samples contained remnants of improperly cleaned viruses. After
an analysis of those comments, we can report that a VERY SMALL number
of entries in a few testbeds does NOT belong there:

                         Number of         Number of objects  Inaccuracy
                         improper samples  in testbed         ratio
  ---------------------+------------------+------------------+----------
  Zoo File testbed:    I        25        I      158,747     I  0.0006%
  Zoo Macro testbed:   I         5        I       25,231     I  0.02%
  Zoo Script testbed:  I         3        I        1,337     I  0.2%
  ---------------------+------------------+------------------+----------

We also wish to thank the "WildList Organisation" for supporting us
with their set of In-The-Wild viruses; the related results may help
users to compare VTC tests with other ITW-only tests.
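The inaccuracy ratio above is simply the number of improper samples
divided by the number of objects in the respective testbed. A minimal
sketch (Python), using the macro zoo figures from the table above;
the function name is introduced only for this illustration:

   # Inaccuracy ratio = improper samples / objects in the testbed
   def inaccuracy_ratio(improper_samples: int, testbed_objects: int) -> float:
       """Share of testbed objects which do not belong in the testbed."""
       return improper_samples / testbed_objects

   # Macro zoo testbed: 5 improper samples among 25,231 objects
   print(f"{inaccuracy_ratio(5, 25_231):.2%}")   # -> 0.02%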
3. Products participating in test "2003-04":
============================================

For test "2003-04", the following *** 10 *** AntiVirus products
(addressed in subsequent tables by a 3-letter abbreviation) were
tested under Windows-XP:

Table ES2: List of AV products in test "2003-04"
================================================
 Abbreviation / Product / Version
 ------------------------------------
 AVP = Kaspersky AntiVirus 3.0 (135)
 BDF = BitDefender (AVX)
 CMD = Command Software 4.62.4
 DRW = DrWeb 4.26
 FSE = FSecure AntiVirus 1.00.1251
 INO = Inoculan 6.0
 NAV = Norton AntiVirus (Corp. Ed.)
 NVC = Norman Virus Control 5.30.02
 RAV = Rumanian AntiVirus 8.02.001
 SCN = NAI VirusScan 4.16.0
 ------------------------------------

For details of the AV products, including the options used to
determine optimum detection rates, see A3SCNLS.TXT. For scanners
where results are missing, see 8problms.txt.

Concerning frequently asked questions: some AV producers deliberately
do NOT submit their products and even FORBID VTC to test them:

TrendMicro: TrendMicro Germany has again recently informed the author
of this report that they are NOT interested in VTC test participation,
as their scanner is deliberately trimmed to on-access scanning and
detection of In-The-Wild viruses. As VTC also emphasises the
detection of zoo viruses, where their products would produce
"unfavourable results", there is still no intent to submit their
products. When two experts from TrendMicro asked to have their
product tested but the results NOT published - reported only to the
TM lab - VTC, as a university lab devoted to public information, had
to refuse that "offer". Consequently, VTC refrains from inviting
TrendMicro to participate in future tests.

Panda: Panda has permitted tests by any institution and university
*except VTC*. Background: after the first participation of a Panda
product (with less than positive results), Panda's CEO requested that
VTC transfer the whole testbed to Panda labs as a condition for
future test participation. VTC's policy is to send missed samples to
participating companies, but sending the whole testbed is
inconsistent with VTC's security policy (see the VTC Code of
Conduct).

Sophos: as mentioned in previous test reports, there have been
problems which prevented VTC from including Sophos products in its
tests. We are glad to report that these problems have meanwhile been
solved, so Sophos AntiVirus will be included (again) in the next test
("2003-09").

Detailed results, including precision and reliability of virus and
malware identification (including the grids used to assign a
performance level to each product), are presented in the following
(platform-specific) files:
   for WXP:                     6jwxp.txt
   comparison of W32 results:   6mcmp32.txt

In a rather detailed analysis, detection rates are presented for each
platform (operating system), and product behaviour is graded in
comparison with all products tested on the respective platform:
   evaluation/grading for W-XP products:  7evalwxp.txt
   for W32 products:                      7evalcmp.txt
3A. aVTC's Evaluation and Grading Scheme:
=========================================

Under the scope of VTC's grading system, a "perfect" AV/AM product
would have the following characteristics:

Definition (1): A "Perfect AntiVirus (AV) product"
--------------------------------------------------
 1) Will detect ALL viral samples "In-The-Wild" AND at least 99.9%
    of zoo samples, in ALL categories (file, boot, macro and
    script-based viruses), always with the same high precision of
    identification and in every infected sample,
 2) Will detect ALL ITW viral samples in compressed objects for all
    (now: 5) popular packers, and
 3) Will NEVER issue a False Positive alarm on any sample which is
    not viral.
 Remark: detection of "exotic viruses" is presently NOT rated.

Definition (2): A "Perfect AntiMalware (AM) product"
----------------------------------------------------
 1) Will be a "Perfect AntiVirus product", that is:
      100% ITW detection AND
      >99% zoo detection AND
      high precision of identification AND
      high precision of detection AND
      100% detection of ITW viruses in compressed objects AND
      0% False-Positive rate,
 2) AND it will also detect essential forms of malicious software,
    at least in unpacked form, reliably at high rates (>90%).
 Remark: detection of "exotic malware" is presently NOT rated.
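A minimal sketch (Python) of how these criteria can be read as a
checklist; the thresholds follow the definitions above, while the
data structure, figures and function names are purely illustrative:

   # Illustrative result record for one product (rates in percent)
   RESULTS = {
       "itw_detection":        100.0,  # In-The-Wild samples, all categories
       "zoo_detection":         99.9,  # zoo samples, all categories
       "packed_itw_detection": 100.0,  # ITW samples inside the 5 popular packers
       "false_positives":          0,  # number of false alarms on clean objects
       "malware_detection":     92.0,  # non-viral malware, unpacked forms
   }

   def is_perfect_av(r: dict) -> bool:
       """Definition (1): a perfect AntiVirus product."""
       return (r["itw_detection"] == 100.0
               and r["zoo_detection"] >= 99.9   # Definition (2) restates this as >99%
               and r["packed_itw_detection"] == 100.0
               and r["false_positives"] == 0)

   def is_perfect_am(r: dict) -> bool:
       """Definition (2): a perfect AntiMalware product."""
       return is_perfect_av(r) and r["malware_detection"] > 90.0

   print(is_perfect_av(RESULTS), is_perfect_am(RESULTS))   # -> True True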
4. Problems in this test:
=========================

In comparison with aVTC tests concerning other W32 platforms (esp.
W98 and W2k), most products behaved in a "rather benevolent" manner,
as only 2 products crashed (for details, see 8problms.txt):

 4 products had NO PROBLEM:                BDF, DRW, FSE and SCN
 4 products had MINOR PROBLEMS
   (requiring 1 or 2 postscans):           AVP, CMD, RAV and NVC
 2 products crashed on specific files:     INO, NAV

5-XP. Results of on-demand detection under Windows-XP (WXP):
=============================================================

This is a summary of the essential findings for AV/AM products under
WXP. For details, see 7evalwxp.txt. The results of this (first) test
of AV products under WXP are shown in the following table:

Table WXP-A: File/Macro/Script Zoo Virus Detection Rates:
=========================================================
 Scan- I == File Virus == I == Macro Virus == I == Script Virus ==
 ner   I    Detection     I     Detection     I     Detection
 ------+------------------+-------------------+-------------------
 Test  I      0304        I       0304        I       0304
 ------+------------------+-------------------+-------------------
 AVP   I     100.~        I      100.~        I       98.9
 BDF   I      82.9        I       99.0        I       72.4
 CMD   I      98.5        I       99.9        I       89.1
 DRW   I      98.3        I       99.4        I       94.7
 FSE   I     100.~        I      100.~        I       99.5
 INO   I      98.7        I       99.9        I       94.7
 NAV   I      98.3        I       99.6        I       96.8
 NVC   I      97.8        I       99.8        I       87.6
 RAV   I      96.7        I       99.9        I       96.1
 SCN   I      99.8        I      100.0        I       99.6
 ------+------------------+-------------------+-------------------
 Mean     I   97.1%       I      99.8%        I       92.9%
 Mean>10% I   97.1%       I      99.8%        I       92.9%
 ------+------------------+-------------------+-------------------

Remark: for abbreviations of products (code names), see appendix
A5CodNam.txt.

Concerning file zoo virus detection, NO product is able to detect ALL
viruses (rating: "perfect"), but several products detect more than
99% and are rated "excellent":
      AVP and FSE (both 100.~), SCN (99.8%)

Concerning macro zoo virus detection, ONE product detects ALL viruses
and is rated "perfect":
      SCN (100.0%)
In addition, all other (9) products detect more than 99% of viruses
and are rated "excellent":
      AVP, FSE (both 100.~), CMD, INO, RAV (all 99.9%), NVC (99.8%),
      NAV (99.6%), DRW (99.4%) and BDF (99.0%)

Concerning script zoo virus detection, NO product detects ALL
viruses, and only 2 products detect more than 99% and are rated
"excellent":
      SCN (99.6%), FSE (99.5%)

Findings WXP.1: Detection rates for file/macro/script zoo viruses:
------------------------------------------------------------------
 For this selection of products, the mean detection rates for zoo
 viruses are rather high:
      mean file zoo virus detection rate:   97.1%
      mean macro virus detection rate:      99.8%
      mean script virus detection rate:     92.9%
 ------------------------------------------------
 Concerning file zoo viruses:
   NO product detects ALL viruses ("perfect")
   3 products detect more than 99% and are rated "excellent":
      AVP,FSE;SCN
 ------------------------------------------------
 Concerning macro zoo viruses:
   1 product detects ALL macro zoo viruses in all files and is
     rated "perfect":   SCN
   9 products detect almost all macro viruses in almost all files
     and are rated "excellent":
      AVP,FSE;CMD,INO,RAV;NAV,NVC;DRW;BDF
 ------------------------------------------------
 Concerning script zoo viruses:
   NO product detects ALL viruses ("perfect")
   2 products detect almost all script viruses in almost all files
     and are rated "excellent":
      SCN,FSE
 -------------------------------------------------
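The mean rows of Table WXP-A are plain arithmetic means over the ten
products. A minimal sketch (Python); for this illustration the
"100.~" entries are treated as 100.0, and "Mean>10%" is read as the
mean over products scoring above 10% (both are assumptions):

   # File zoo detection rates from Table WXP-A ("100.~" taken as 100.0 here)
   file_zoo_rates = {
       "AVP": 100.0, "BDF": 82.9, "CMD": 98.5, "DRW": 98.3, "FSE": 100.0,
       "INO":  98.7, "NAV": 98.3, "NVC": 97.8, "RAV": 96.7, "SCN":  99.8,
   }

   mean = sum(file_zoo_rates.values()) / len(file_zoo_rates)
   print(f"mean file zoo detection rate: {mean:.1f}%")       # -> 97.1%

   # "Mean>10%" averages only products scoring above 10%; in this test no
   # product falls below that threshold, so both mean rows coincide.
   above = [r for r in file_zoo_rates.values() if r > 10.0]
   print(f"mean over products > 10%:     {sum(above)/len(above):.1f}%")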
Findings WXP.2: Development of ITW file/macro/script virus detection
rates:
---------------------------------------------------------------------
 5 AV products (out of 10) detect ALL In-The-Wild file, macro and
 script viruses in ALL instantiations (files) and are rated
 "perfect":
      AVP,DRW,FSE,NAV,SCN
 2 more scanners are "excellent":
      INO,RAV
 -------------------------------------------------
 Concerning detection of ITW file viruses only:
   5 "perfect" scanners:        AVP,DRW,FSE,NAV,SCN
   2 more "excellent" scanners: INO,RAV
 -------------------------------------------------
 Concerning detection of ITW macro viruses only:
   6 "perfect" scanners:        AVP,DRW,FSE,INO,NAV,SCN
   4 "excellent" scanners:      BDF,CMD,NVC,RAV
 -------------------------------------------------
 Concerning detection of ITW script viruses:
   8 "perfect" scanners:        AVP,CMD,DRW,FSE,NAV,NVC,RAV,SCN
   1 more "excellent" scanner:  INO
 -------------------------------------------------

Findings WXP.3: Assessment of overall (ITW/zoo) detection rates:
----------------------------------------------------------------
 NO WXP product is overall rated "perfect"
 2 "excellent" overall scanners:  SCN,FSE
 3 "very good" overall scanners:  AVP,NAV,RAV
 -------------------------------------------------

Findings WXP.4: Performance of WXP scanners by virus classes:
-------------------------------------------------------------
 Perfect   scanners for file zoo:   ---
 Excellent scanners for file zoo:   AVP,FSE,SCN
 Very Good scanners for file zoo:   INO,CMD,DRW,NAV,NVC,RAV
 ------------------------------------------------------------
 Perfect   scanners for macro zoo:  SCN
 Excellent scanners for macro zoo:  AVP,FSE,CMD,INO,RAV,NAV,NVC,DRW,BDF
 Very Good scanners for macro zoo:  ---
 ------------------------------------------------------------
 Perfect   scanners for script zoo: ---
 Excellent scanners for script zoo: SCN,FSE
 Very Good scanners for script zoo: AVP,NAV,RAV
 ------------------------------------------------------------

Findings WXP.5: Detection of packed viral (ITW) objects:
--------------------------------------------------------
 Evaluation grid:
   a "perfect"    product will detect ALL viruses in objects packed
                  with ALL 5 packers
   an "excellent" product will detect ALL viruses in objects packed
                  with 4 packers
   a "very good"  product will detect ALL viruses in objects packed
                  with 3 packers

 Concerning OVERALL detection of packed file AND macro viruses:
   1 product is "perfect":      SCN
   2 products are "excellent":  AVP,DRW
   1 product is "very good":    FSE
 -------------------------------------------------------
 Concerning detection of packed file viruses only:
   "Perfect"   packed file virus detectors:   AVP,FSE,SCN
   "Excellent" packed file virus detectors:   DRW
   "Very Good" packed file virus detectors:   ---
 -------------------------------------------------------
 Concerning detection of ALL packed macro viruses:
   "Perfect"   packed macro virus detectors:  SCN
   "Excellent" packed macro virus detectors:  AVP,DRW
   "Very Good" packed macro virus detectors:  RAV
 -------------------------------------------------------

Findings WXP.6: Avoidance of False Alarms:
------------------------------------------
 Avoidance of False-Positive alarms is rather well developed, at
 least concerning file-FP avoidance.
 4 overall FP-avoiding "perfect" WXP scanners:  BDF,INO,NAV,SCN
 ---------------------------------------------------
 Concerning file-FP avoidance, 9 (of 10) products are "perfect":
      AVP,BDF,CMD,FSE,INO,NAV,NVC,RAV,SCN
   and 1 more product is "excellent":  DRW
 ---------------------------------------------------
 Concerning macro-FP avoidance, these 4 products are "perfect":
      BDF,INO,NAV,SCN
   and 1 more product is "excellent":  RAV
 ---------------------------------------------------

Findings WXP.7: Detection rates for file/macro/script malware:
--------------------------------------------------------------
 Generally, the detection of malware is insufficient, as indicated
 by the mean detection rates:
      mean detection rate for file malware:    81.3%
                          for macro malware:   96.6%
                          for script malware:  67.8%
 ---------------------------------------------------
 Overall file/macro/script malware detection needs significant
 improvement:
   0 products are "perfect":      ---
   3 products are "excellent":    FSE,AVP,SCN
   1 product is "very good":      RAV
 ---------------------------------------------------
 Concerning only file malware detection:
   0 products are "perfect":      ---
   3 products are "excellent":    FSE,AVP,SCN
   1 product is rated "very good": RAV
 ---------------------------------------------------
 Concerning only macro malware detection:
   3 products are "perfect":      AVP,FSE,SCN
   7 products are "excellent":    CMD,RAV,NVC,INO,NAV,BDF,DRW
   0 products are rated "very good": ---
 ---------------------------------------------------
 Concerning only script malware detection:
   0 products are "perfect":      ---
   4 products are "excellent":    SCN,FSE,AVP,NAV
   1 product is rated "very good": RAV
 ---------------------------------------------------

Grading WXP products according to their detection performance:
===============================================================

Under the scope of VTC's grading system (see 3A), we summarise our
results for WXP-related scanners:

*******************************************************************
 In aVTC test "2003-04", we found *** NO perfect W-XP AV product ***
                    and we found *** NO perfect W-XP AM product ***
*******************************************************************

But several products approach our definitions on a rather high level
(taking into account that "perfect" is defined on the 100% level, and
"excellent" is defined by 99% for virus detection and 90% for malware
detection):

Test category:            "Perfect"                "Excellent"
------------------------------------------------------------------
WXP file ITW test:        AVP,DRW,FSE,NAV,SCN      INO,RAV
WXP macro ITW test:       AVP,DRW,FSE,INO,NAV,SCN  BDF,CMD,NVC,RAV
WXP script ITW test:      AVP,CMD,DRW,FSE,NAV,     INO
                          NVC,RAV,SCN
------------------------------------------------------------------
WXP file zoo test:        ---                      AVP,FSE,SCN
WXP macro zoo test:       SCN                      AVP,FSE,CMD,INO,RAV,
                                                   NAV,NVC,DRW,BDF
WXP script zoo test:      ---                      SCN,FSE
------------------------------------------------------------------
WXP file pack test:       AVP,FSE,SCN              DRW
WXP macro pack test:      SCN                      AVP,DRW
------------------------------------------------------------------
WXP file FP avoidance:    AVP,BDF,CMD,FSE,INO,     DRW
                          NAV,NVC,RAV,SCN
WXP macro FP avoidance:   BDF,INO,NAV,SCN          RAV
------------------------------------------------------------------
WXP file malware test:    ---                      FSE,AVP,SCN
WXP macro malware test:   AVP,FSE,SCN              CMD,RAV,NVC,INO,
                                                   NAV,BDF,DRW
WXP script malware test:  ---                      SCN,FSE,AVP,NAV
------------------------------------------------------------------
In order to support the race for more customer protection, we
evaluate the order of performance in this WXP test with a simple
algorithm: over all test categories in the table above, the "perfect"
and "excellent" ratings of each product are counted, weighing
"perfect" twice and "excellent" once, and the first places are ranked
by this point total:

************************************************************
 "Perfect" Windows-XP AntiVirus product:    =NONE=  (20 points)
 "Excellent" Windows-XP AntiVirus products:
     1st place:  SCN          (18 points)
     2nd place:  AVP,FSE      (13 points)
     4th place:  NAV          (11 points)
     5th place:  DRW          (10 points)
     6th place:  INO          ( 9 points)
     7th place:  RAV          ( 8 points)
     8th place:  BDF,CMD,NVC  ( 6 points)
************************************************************
 "Perfect" Windows-XP AntiMalware product:  =NONE=  (26 points)
 "Excellent" Windows-XP AntiMalware products:
     1st place:  SCN          (22 points)
     2nd place:  AVP,FSE      (17 points)
     4th place:  NAV          (13 points)
     5th place:  DRW          (11 points)
     6th place:  INO          (10 points)
     7th place:  RAV          ( 9 points)
     8th place:  BDF,CMD,NVC  ( 7 points)
************************************************************
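A minimal sketch (Python) of this counting scheme; SCN's ratings are
read off the summary table above, and the dictionary layout and
category labels are introduced only for this illustration:

   # Weighted count of places: "perfect" = 2 points, "excellent" = 1 point
   WEIGHTS = {"perfect": 2, "excellent": 1}

   def score(grades: dict) -> int:
       """Sum the weights of a product's per-category ratings."""
       return sum(WEIGHTS.get(rating, 0) for rating in grades.values())

   # SCN's ratings in the 10 AntiVirus categories of the summary table
   scn_av = {
       "file ITW":  "perfect",   "macro ITW":  "perfect", "script ITW": "perfect",
       "file zoo":  "excellent", "macro zoo":  "perfect", "script zoo": "excellent",
       "file pack": "perfect",   "macro pack": "perfect",
       "file FP":   "perfect",   "macro FP":   "perfect",
   }
   # The 3 additional AntiMalware categories
   scn_am = dict(scn_av, **{
       "file malware":   "excellent",
       "macro malware":  "perfect",
       "script malware": "excellent",
   })

   print(score(scn_av))   # -> 18 (of 20 possible AV points)
   print(score(scn_am))   # -> 22 (of 26 possible AM points)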
9. Comparison of detection results under Windows-32 platforms:
==============================================================

In this part, the detection results of AV products tested under W-98
and W-2000 (in test "2002-12") and of the related products tested
under W-XP (in test "2003-04") are compared. For details, see
7evalw32.txt.

With the fast deployment of new versions of Microsoft Windows-32 (in
the past 5 years from W-NT to W-95, W-98, W-2000 and now W-XP), both
customers needing protection and producers of security-enhancing
software (esp. AntiVirus and AntiMalware) can only cope with the pace
when they essentially re-use engines prepared for previous W32
platforms and simply "adapt" them to the intrinsics of the new
platform. Otherwise, "rewriting" the respective software would
consume too much time and effort, and customers would receive
"adapted" products only with some delay.

AV/AM testers cannot determine the characteristics of the algorithms
in scanning engines, either for legal reasons (most copyright laws
prohibit reverse-engineering of proprietary code, except for specific
purposes such as collecting evidence for a court case or teaching
related techniques, as in the Hamburg University IT Security
curriculum), or because of the sheer complexity of the related code
(and, in many cases, insufficient professional knowledge of testers).
It is therefore worthwhile to analyse whether those AV/AM products
whose versions are available for all W32 platforms behave EQUALLY
concerning detection and identification of viral and malicious code.

Test Hypothesis: "W32-harmonical" behaviour of W32 products:
============================================================

We assume that those products which were tested on all W32 platforms
(W98 and W2k in test "2002-12", WXP in this test) shall yield
IDENTICAL results in ALL categories (argument for this assumption:
the likelihood of reuse of the same engine across the same platform
family). We call product behaviour following this hypothesis
"W32-harmonical".

Comparing all three W32 platforms, macro viruses/malware are better
handled than script viruses/malware, but file virus/malware detection
is still rather different between platforms:

                                             this test   last test
                                            -----------+-----------
 Equal detection of zoo file viruses:        6 (of 10)     -----
                  of zoo infected files:     5 (of 10)     -----
                  of ITW file viruses:      ALL (of 10)    -----
                  of ITW infected files:    ALL (of 10)    -----
                  of zoo file malware:       4 (of 10)     -----

                                             this test   last test
                                            -----------+-----------
 Equal detection of zoo macro viruses:       8 (of 10)     -----
                  of zoo infected macro
                     objects:                8 (of 10)     -----
                  of ITW macro viruses:     ALL (of 10)    -----
                  of ITW infected macro
                     files:                 ALL (of 10)    -----
                  of zoo macro malware:      7 (of 10)     -----

                                             this test   last test
                                            -----------+-----------
 Equal detection of zoo script viruses:      8 (of 10)     -----
                  of zoo script viral
                     objects:                7 (of 10)     -----
                  of ITW script viruses:    ALL (of 10)    -----
                  of ITW script viral
                     objects:               ALL (of 10)    -----
                  of zoo script malware:     8 (of 10)     -----
 --------------------------------------------------------------

Concerning detection of FILE viruses (in all objects), 5 (of 10)
products behave "W32-harmonically" in all categories:
      AVP,BDF,CMD,INO,SCN
And concerning file malware detection, only 4 (of 10) products behave
in W32-harmonical form:
      AVP,CMD,RAV,SCN
 --------------------------------------------------------------
Concerning detection of MACRO viruses (in all objects), a MAJORITY of
8 (of 10) products behave "W32-harmonically" in all categories:
      AVP,CMD,DRW,FSE,INO,NVC,RAV,SCN
And concerning macro malware detection, 7 (of 10) products behave in
W32-harmonical form:
      AVP,BDF,CMD,DRW,FSE,INO,SCN
 --------------------------------------------------------------
Concerning detection of SCRIPT viruses (in all objects), a MAJORITY
of 7 (of 10) products behave "W32-harmonically" in all categories:
      AVP,CMD,DRW,INO,NVC,RAV,SCN
And concerning script malware detection, 8 (of 10) products behave in
W32-harmonical form:
      AVP,CMD,DRW,FSE,INO,NVC,RAV,SCN
 --------------------------------------------------------------

The following grid is used to grade W32 products concerning their
ability to deliver IDENTICAL detection results for ALL categories on
ALL W32 platforms:

 A "perfect" W32-harmonical AV product will yield IDENTICAL results
 for all categories (macro and script viruses). (Assigned value: 5)

 A "perfect" W32-harmonical AM product will be a perfect
 W32-harmonical AV product and yield IDENTICAL results for all
 categories (macro and script malware). (Assigned value: 2)
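A minimal sketch (Python) of how this hypothesis can be checked: a
product is "W32-harmonical" in a category if it yields identical
figures on every W32 platform. The data layout and the example
figures below are invented purely for illustration:

   # Per-platform results of one hypothetical product (figures invented)
   detections = {
       "W98": {"zoo macro viruses": 7290, "zoo script viruses": 815},
       "W2k": {"zoo macro viruses": 7290, "zoo script viruses": 815},
       "WXP": {"zoo macro viruses": 7290, "zoo script viruses": 812},
   }

   def is_harmonical(per_platform: dict, category: str) -> bool:
       """True if the result for this category is identical on all platforms."""
       return len({results[category] for results in per_platform.values()}) == 1

   print(is_harmonical(detections, "zoo macro viruses"))    # -> True
   print(is_harmonical(detections, "zoo script viruses"))   # -> False

A product graded "perfect" W32-harmonical then receives the assigned
values above (5 points as AV product, plus 2 as AM product), matching
the 5- and 7-point totals in the grading below.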
Grading W32-harmonical AntiVirus products:
===========================================================
 Grade "Perfect" W32-harmonical detection:   AVP,CMD,SCN
===========================================================

Grading W32-harmonical AntiMalware products:
===========================================================
 Grade "Perfect" W32-harmonical detection:   AVP,CMD,SCN
===========================================================

************************************************************
 "Perfect" W32-harmonical AntiVirus products:
     1st place:  AVP,CMD,SCN  (5 points)
************************************************************
 "Perfect" W32-harmonical AntiMalware products:
     1st place:  AVP,CMD,SCN  (7 points)
************************************************************

11. Final remark: In search of the "Perfect AV/AM product":
===========================================================

Based on the aVTC evaluation scheme (as described in 3A), this is the
result of aVTC test "2003-04":

************************************************************
 In general, there is NO AntiVirus and NO AntiMalware product
 which can be rated "PERFECT" for ALL categories
 (file, macro AND script).
 -----------------------------------------------------------
 But for SINGLE categories (file, macro OR script viruses),
 there are SEVERAL products which can be rated "perfect" or
 "excellent".
************************************************************

14. Availability of full test results:
======================================

Much more information about this test, its methods and viral
databases, as well as detailed test results, is available for
anonymous FTP download from VTC's homepage (VTC is part of Working
Group AGN):

      ftp://agn-www.informatik.uni-hamburg.de/vtc

Any comment or critical remark which helps VTC to learn and to
improve our test methods is warmly welcomed.

Next tests:

 Test HEUREKA-III will test the ability of the products submitted for
 tests "2002-12" and "2003-04" to detect macro/script viruses/malware
 which were first reported in 2 consecutive 3-month periods AFTER
 product delivery (plan: to be published in June 2003).

 The next comparative test "2003-09" will test detection of file,
 macro and script viruses/malware. Products were submitted in
 February 2003; the testbeds were frozen on December 31, 2002.

 A special test performed by 4 members of the test team will examine
 the ability of AV products to detect viruses packed with more than
 30 packers, partially applied recursively.

The VTC Test Team:
------------------
 VTC supervisor:          Jan Seedorf
 Coordination:            Klaus Brunnstein, Michel Messerschmidt,
                          Jan Seedorf
 Testbeds:                Klaus Brunnstein, Martin Kittel
 Hardware (C/S):          Martin Kittel
 System Software:         Martin Kittel, Jan Seedorf
 Managing Products:       Klaus Brunnstein
 DOS Scanners:            Heiko Fangmeier
 WXP Scanners:            Ulrike Siekierski, Stefan Heimann,
                          Eugen Hofmann, Maxim Mariach, Jan Menne,
                          Fabian Mueller, Jan Seedorf
 Evaluation, QA:          Michel Messerschmidt, Thomas Buck
 Graphical Presentation:  Thomas Buck, Jan Seedorf
 Test report:             Klaus Brunnstein (supported by all)

On behalf of the VTC test team:
Dr. Klaus Brunnstein (April 30, 2003)

15. Copyright, License, and Disclaimer:
=======================================

This publication is (C) Copyright 2003 by Klaus Brunnstein and the
Virus Test Center (VTC) at the University of Hamburg, Germany.
Permission (Copy-Left) is granted to everybody to distribute copies
of this information in electronic form, provided that this is done
for free, that the contents of the information are not changed in any
way, and that the origin of this information is explicitly mentioned.
It is esp. permitted to store and distribute this set of text files
at university or other public mirror sites where security/safety
related information is stored for unrestricted public access, free of
charge.

Any other use, esp. including distribution of these text files on
CD-ROMs or any publication as a whole or in parts, is ONLY permitted
after contact with the supervisor, Prof. Dr. Klaus Brunnstein, or
authorised members of the Virus Test Center at Hamburg University,
and such an agreement must be made explicitly in writing prior to any
publication.

No responsibility is assumed by the author(s) for any injury and/or
damage to persons or property as a matter of product liability,
negligence or otherwise, or from any use or operation of any methods,
products, instructions or ideas contained in the material herein.

Dr. Klaus Brunnstein
Professor for Applications of Informatics
University of Hamburg, Germany
(April 30, 2003)