============================================================
File 0XECSUM.TXT: "EXECUTIVE SUMMARY"
VTC University of Hamburg AntiMalware Product Test "1998-10"
============================================================

Formatted with non-proportional font (Courier)

Content of this text:
=====================
 1. Background of this test
 2. VTC testbeds used in VTC test "1998-10"
 3. Summary #1: Development of DOS scanner detection rates      Result #1
 4. Summary #2: Performance of DOS scanners on ZOO testbeds     Result #2
 5. Summary #3: Performance of DOS scanners on ITW testbeds     Result #3
 6. Summary #4: Performance of DOS scanners by virus classes    Result #4
 7. Summary #5: Detection of viruses in compressed objects (DOS) Result #5
 8. Summary #6: False Positive avoidance (DOS/Win-NT)           Result #6
 9. Summary #7: Detection of File/Macro Malware (DOS/Win-NT)    Result #7
10. Summary #8: File/Macro virus detection under Win-NT         Result #8
11. Summary #9: File/Macro virus detection under 32-bit engines Result #9
12. Summary #10: Malware detection under Windows 95/NT          Result #10
13. Conclusion: Searching the "Perfect AV/AM product"
14. Availability of full test results
15. Disclaimer

Tables:
=======
Table ES1: Content of VTC test databases
Table ES2: List of AV products in Test "1998-10"
Table ES3: Development of DOS scanners from 1997-02 to 1998-10
Table ES4: Development of Win-NT scanners since 1997-07
Table ES5: Malware detection under Windows-95/Windows-NT

1. Background of this test:
===========================

Malicious software (malware), including viruses (= self-replicating
malware), Trojan horses (= pure payload without self-replication),
virus droppers and network malware (e.g. worms and hostile applets),
is regarded as a serious threat to PC users. With more than 4,000
new viruses and several hundred Trojan horses deployed annually,
many of which are available from the Internet, and in the absence
of inherent protection against such dysfunctional software, users
must rely on AntiMalware (AM) and esp. AntiVirus (AV) software to
detect and eradicate malicious software. Hence, the quality of
AntiMalware products becomes an essential means of protecting
customer productivity and data.

The Virus Test Center (VTC) at Hamburg University's Faculty for
Informatics performs regular tests of AntiMalware and AntiVirus
software. VTC recently tested current versions of on-demand
scanners for their ability to identify PC viruses. Tests were
performed on VTC's malware databases, which were frozen at their
April 30, 1998 status to give AV producers a fair chance to supply
updates within the 6-week test period. The test goal was to
determine detection rates as well as the reliability (= consistency)
of malware identification and detection for submitted or publicly
available scanners. It was also tested whether, and to what degree,
scanners detect viruses compressed with four popular compression
methods. As a new test category, the avoidance of False Positive
alarms on "clean" (= non-viral and non-malicious) objects was also
determined.

VTC maintains collections of boot, file and macro viruses as well
as related malware ("zoo"). Moreover, following the monthly list of
"In-The-Wild Viruses" (published by Wildlist.org), a separate
collection of viruses reported to be broadly visible is maintained
to allow for comparison with other tests; presently, this list does
not report ITW malware.
2. VTC testbeds used in VTC test "1998-10":
===========================================

Table ES1: Content of VTC test databases:
=================================================================
"Full Zoo": 13,993 File Viruses    in 112,038 infected files,
             3,321 different File Malware in 7,989 file objects,
               881 System Viruses  in   4,804 infected images,
             2,159 Macro Viruses   in   9,033 infected documents,
               111 different Macro Malware in 191 macro objects,
             3,300 clean files used for False Positive test
-----------------------------------------------------------------
"ITW Zoo":     122 File Viruses    in   3,591 infected files,
               207 System Viruses  in   1,366 infected images, and
                75 Macro Viruses   in     710 infected documents;
               325 clean macro objects used for False Positive test
=================================================================
(For a detailed index of VTC testbeds, see file "a3testbed.zip")

For test "1998-10", the following AntiVirus products (abbreviation,
manufacturer) were tested under DOS/Windows, Windows-95, Windows-98
and Windows-NT:

Table ES2: List of AV products in Test "1998-10":
=================================================
ACCURA (ALE, Livingstone), AVAST! (AVS, Alwil), AVG (AVG, Grisoft),
AVP (AVP, KAMI Ltd), AntiVir (AVK, H+B EDV), Anyware (ANT, Anyware),
DrWeb (DRW, Dialogue Science), DSAV (DSS, Dr. Solomon),
F-Prot and F-MacroW (FPR+FMA, Frisk Software), F-SECURE (FSE, Data
Fellows), F/Win (FWN, Kurtzhals), HMVS (HMV, Valky/Vrtik),
IBM AV (IBM, IBM), Inoculan (INO, CAI/Cheyenne), IRIS (IRS, IRIS
Israel), InVircible (IVB), Integrity Master (ITM, Stiller Research),
Norman Virus Control (NVC, Norman Data), Norton AV (NAV, Symantec),
Power Antivirus (PAV, G-Data), RAV (RAV, GeCAD Romania),
Scan (SCN, McAfee), Sweep (SWP, Sophos), TBAV (TBA, ThunderByte),
TNT (TNT, Carmel), TSCAN (TSC, Marx), VBS (VBS, Virus Buster),
VET (VET, CYBEC) and VSP (VSP, VirScan Plus).

AV products were either submitted or, when test versions were
available on the Internet, downloaded from the respective ftp/http
sites. A few more scanners were withdrawn from VTC tests in general
or from this test, some of which were announced to participate in
some future test. Finally, very few AV producers answered VTC's
bids for submitting scanners with electronic silence.

Concerning malware detection, some AV producers asked that their
product NOT be tested in this category. While VTC regards malware
as an essential threat to users and therefore suggests including
malware detection in ANY AV product, it followed such requests;
therefore, some AV products are missing from the malware detection
tables.

The following text surveys essential findings in comparison with
previous VTC tests (performance over time), gives a relative
"grading" of scanners for the detection of file and macro viruses
as well as related malware, uncompressed, both in the full "zoo"
and "In-The-Wild", and covers the detection of file and macro
viruses in objects compressed with ARJ, LHA, ZIP and RAR. Finally,
the ability of AV products to avoid False Positive alarms was
included.

Detailed results, including precision and reliability of virus
identification as well as results for boot/MBR infectors, are
described in overview table "6ASUMOV.TXT" and the related tables
for DOS/boot+file+macro, Win-95, Win-98 and Win-NT detection.
Moreover: for an evaluation of test results, see 7EVAL.TXT, and
for a comparison of performance data, see 6ASUMOV.TXT.
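Note on the figures quoted below: a detection rate in this report
is the share of infected test objects which a scanner flags,
expressed in percent. The following minimal sketch (Python; the
function name and the flagged count are illustrative, not VTC's
actual test tooling) shows the computation:

    def detection_rate(detected: int, total: int) -> float:
        """Percentage of infected test objects flagged by a scanner."""
        if total <= 0:
            raise ValueError("testbed must contain at least one object")
        return 100.0 * detected / total

    # Illustrative example with the "ITW Zoo" macro testbed size from
    # Table ES1 (710 infected documents); the count 705 is hypothetical:
    print(f"{detection_rate(705, 710):.1f}%")   # -> 99.3%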
3. Summary #1: Development of DOS scanner detection rates:
==========================================================

Concerning the performance of DOS scanners, a comparison of virus
detection results in the tests from "1997-02" to the new test
"1998-10" shows how scanners behave and how manufacturers adapt
their products to the growing threat of new viruses. The following
table lists the development of the detection rates of the scanners
(most recent version in each test) and the change between the last
two tests (DELTA; + indicating improvement); a sketch of the DELTA
computation follows the table.

For reasons of fairness, it must be noted that improvement is much
harder to achieve for those products which have already reached a
very high level of detection and quality (say: more than 90 or 95%)
than for products with lower detection rates. Moreover, changes on
the order of +-2% are not significant, as this is roughly the
growth rate of the testbeds per month; a scanner's result therefore
depends strongly on whether some virus is reported (and analysed
and included) just before a new update is delivered.

The following table lists the developments for the detection of
file and macro viruses; for details, as well as for boot virus
detection, see result tables 6b to 6f in the full report.

Table ES3: Development of DOS scanners from 1997-02 to 1998-10:
====================================================================
      ---- File Virus Detection ----  ---- Macro Virus Detection ---
Scan-  97/02 97/07 98/02 98/10 DELTA   97/02 97/07 98/02 98/10 DELTA
ner      %     %     %     %     %       %     %     %     %     %
--------------------------------------------------------------------
ALE    98.8  94.1  89.4    -     -     96.5  66.0  49.8    -     -
AVS    98.9  97.4  97.4  97.9  +0.5    99.3  98.2  80.4  97.2 +16.8
AVG    79.2  85.3  84.9  87.6  +2.7    25.2  71.0  27.1  81.6 +54.6
AVK      -     -     -   90.0    -       -     -     -   99.7    -
AVP    98.5  98.4  99.3  99.7  +0.4    99.3  99.0  99.9 100.0  +0.1
ANT    73.4  80.6  84.6  75.7  -8.9    58.0  68.6  80.4  56.6 -23.8
DRW    93.2  93.8  92.8  93.1  +0.3    90.2  98.1  94.3  99.3  +5.0
DSS    99.7  99.6  99.9  99.9  +0.0    97.9  98.9 100.0 100.0  +0.0
FMA      -     -     -     -     -     98.6  98.2  99.9    -     -
FPR    90.7  89.0  96.0  95.5  -0.5    43.4  36.1  99.9  99.8  -0.1
FSE      -     -   99.4  99.7  +0.3      -     -   99.9  90.1  -9.8
FWN      -     -     -     -     -     97.2  96.4  91.0  85.7  -5.3
IBM    93.6  95.2  96.5    -     -     65.0  88.8  99.6    -     -
INO      -     -   92.0  93.5  +1.5      -     -   90.3  95.2  +4.9
IRS      -   81.4  74.2    -     -       -   69.5  48.2    -  -22.3
ITM      -   81.0  81.2  65.8 -15.4    81.8  58.2  68.6  76.3  +7.7
IVB     8.3    -     -     -     -       -     -     -     -     -
HMV      -     -     -     -     -       -     -   98.2  99.0  +0.8
NAV    66.9  67.1  97.1  98.1  +1.0    80.7  86.4  98.7  99.8  +1.1
NVC    87.4  89.7  94.1  93.8  -0.3    13.3  96.6  99.2  90.8  -8.4
PAN      -     -   67.8    -     -       -     -   73.0    -     -
PAV      -   96.6  98.8    -     -       -   93.7 100.0    -     -
PCC      -     -     -     -     -       -   67.6    -     -     -
PCV    67.9    -     -     -     -       -     -     -     -     -
RAV      -     -     -   71.0    -       -     -     -   99.5    -
SCN    83.9  93.5  90.7  87.8  -2.9    95.1  97.6  99.0  98.6  -0.4
SWP    95.9  94.5  96.8  98.4  +1.6    87.4  89.1  98.4  98.6  +0.2
TBA    95.5  93.7  92.1  93.2  +1.1    72.0  96.1  99.5  98.7  -0.8
TNT    58.0    -     -     -     -       -     -     -     -     -
TSC      -     -   50.4  56.1  +5.7      -     -   81.9  17.0 -64.9
VBS    43.1  56.6    -   35.5    -       -     -     -     -     -
VDS      -   44.0  37.1    -     -     16.1   9.9   8.7    -     -
VET      -   64.9    -     -     -       -   94.0  97.3  97.5  +0.2
VRX      -     -     -     -     -       -     -     -     -     -
VHU    19.3    -     -     -     -       -     -     -     -     -
VSA      -     -   56.9    -     -       -     -   80.6    -     -
VSP      -     -     -   76.1    -       -     -     -     -     -
VSW      -     -   56.9    -     -       -     -   83.0    -     -
VTR    45.5    -     -     -     -      6.3    -     -     -     -
XSC    59.5    -     -     -     -       -     -     -     -     -
--------------------------------------------------------------------
Mean   74.2  84.8  84.4  85.4          69.6  80.9  83.8  89.6
--------------------------------------------------------------------
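The DELTA columns above are plain differences between the two most
recent test results, so every entry can be recomputed mechanically.
A minimal sketch (Python; the data layout is illustrative), using
SWP's file virus history from Table ES3:

    def delta(history: list):
        """Change between the two most recent tests; None if one of
        the two most recent results is missing ('-' in the table)."""
        previous, latest = history[-2], history[-1]
        if previous is None or latest is None:
            return None
        return round(latest - previous, 1)

    # SWP, file virus detection: 97/02, 97/07, 98/02, 98/10 (in %)
    swp_file = [95.9, 94.5, 96.8, 98.4]
    print(delta(swp_file))   # -> 1.6, printed as +1.6 in Table ES3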
**************************************************************
Result #1.1) In comparison with the last VTC test ("1998-02"),
        the ability of scanners to detect file viruses under
        DOS improved only slightly (mean value up 1% to 85.4%).
        Equally, only 6 out of 20 scanners in the test detected
        100% of ITW file viruses.
      #1.2) On the better side, the ability of scanners to
        detect macro viruses improved significantly (by almost
        6%) to a mean detection rate of 89.6%. Now 2 scanners
        detect ALL zoo macro viruses, and 17 (out of 21)
        scanners detect ALL ITW macro viruses. This indicates
        that contemporary macro viruses are not technically
        difficult to process.
      #1.3) Evidently, most AV producers invest relatively more
        work into the detection of macro viruses than of file
        viruses. There is a risk that the threat of file
        viruses is underestimated!
**************************************************************

4. Summary #2: Performance of DOS scanners on zoo testbeds:
===========================================================

Concerning the rating of DOS scanners, the following grid is
applied to classify scanners:

 - detection rate =100.0%   : scanner is graded "perfect"
 - detection rate above 95% : scanner is graded "excellent"
 - detection rate above 90% : scanner is graded "very good"
 - detection rate of 80-90% : scanner is graded "good enough"
 - detection rate of 70-80% : scanner is graded "not good enough"
 - detection rate of 60-70% : scanner is graded "rather bad"
 - detection rate of 50-60% : scanner is graded "very bad"
 - detection rate below 50% : scanner is graded "useless"

To assign an "overall grade" (including file and macro virus
detection), the lowest of the related results is used to classify
the respective scanner (see the sketch after this section). If
several scanners of the same producer were tested, grading is
applied to the most recent version (which is, in most cases, the
version with the highest detection rates). Only scanners for which
all tests were completed are considered; here, the most recent
version with completed tests was selected.

The following list indicates those scanners graded into one of the
upper three categories, with file and macro virus detection rates
on uncompressed objects, and with perfect ITW virus detection
(rate=100%):

                          (file/macro zoo; file/macro ITW)
                          --------------------------------
"Perfect" DOS scanners  : =NONE=
"Excellent" DOS scanners: DSS ( 99.9% 100.0%; 100.0% 100.0%)
                          AVP ( 99.7% 100.0%; 100.0% 100.0%)
                          SWP ( 98.4%  98.6%; 100.0% 100.0%)
"Very Good" DOS scanners: AVS ( 97.9%  97.2%;  99.2% 100.0%)
                          FPR ( 95.5%  99.8%;  99.2% 100.0%)
                          INO ( 93.5%  95.2%;  99.2% 100.0%)
                          NVC ( 93.8%  90.8%; 100.0% 100.0%)
                          TBA ( 93.2%  98.7%;  99.2% 100.0%)
                          DRW ( 93.1%  99.3%;  99.2% 100.0%)
                          FSE ( 99.7%  90.1%; 100.0% 100.0%)
                          AVK ( 90.0%  99.7%; 100.0% 100.0%)

**************************************************************
Result #2) The overall virus detection quality of DOS scanners
        has reached a very acceptable level, also for viruses
        which are not "in-the-wild", but with a bias towards
        better macro virus detection.
**************************************************************
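The grading grid and the "lowest result wins" rule are compact
enough to state in code. A minimal sketch (Python); the handling
of exact boundary values is an assumption, chosen so that AVK's
90.0% file score still grades "very good" as in the list above:

    # Grade thresholds from the grid in Summary #2, best grade first.
    GRADES = [
        (100.0, "perfect"),
        (95.0,  "excellent"),
        (90.0,  "very good"),
        (80.0,  "good enough"),
        (70.0,  "not good enough"),
        (60.0,  "rather bad"),
        (50.0,  "very bad"),
    ]

    def grade(rate: float) -> str:
        for threshold, name in GRADES:
            if rate >= threshold:
                return name
        return "useless"            # below 50%

    def overall_grade(file_rate: float, macro_rate: float) -> str:
        """The lowest of the related results determines the grade."""
        return grade(min(file_rate, macro_rate))

    print(overall_grade(99.9, 100.0))   # DSS zoo rates -> "excellent"
    print(overall_grade(90.0, 99.7))    # AVK zoo rates -> "very good"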
5. Summary #3: Performance of DOS scanners on ITW testbeds:
===========================================================

Concerning "In-The-Wild" viruses, a much more rigid grid must be
applied to classify scanners, as the likelihood is significant
that a user may find such a virus on her/his machine. The
following grid is applied:

 - detection rate is 100% : scanner is "perfect"
 - detection rate is >99% : scanner is "excellent"
 - detection rate is >95% : scanner is "very good"
 - detection rate is >90% : scanner is "good"
 - detection rate is <90% : scanner is "risky"

                              (file ITW; macro ITW)
"Perfect" DOS ITW scanners  : AVK (100.0% 100.0%)
                              AVP (100.0% 100.0%)
                              DSS (100.0% 100.0%)
                              FSE (100.0% 100.0%)
                              NVC (100.0% 100.0%)
                              SWP (100.0% 100.0%)

Several scanners miss the highest category for ITW detection only
marginally; the following scanners are rated "excellent":

"Excellent" DOS ITW scanners: AVG ( 99.2% 100.0%)
                              AVS ( 99.2% 100.0%)
                              DRW ( 99.2% 100.0%)
                              FPR ( 99.2% 100.0%)
                              INO ( 99.2% 100.0%)
                              TBA ( 99.2% 100.0%)

As macro-only products, VET and HMV also reach "perfect" 100% ITW
detection.

**************************************************************
Result #3) The In-The-Wild detection of the best DOS scanners
        has improved since the last test, esp. concerning
        macro virus detection.
**************************************************************

6. Summary #4: Performance of DOS scanners by virus classes:
============================================================

Some scanners are specialised in detecting one class of viruses,
either by deliberately limiting themselves to one class (esp.
macro viruses) or because that part is significantly better than
the other parts. It is therefore worth noting which scanners
perform best in detecting file, boot and macro viruses. Compared
to the last test, the number of "excellent" macro virus detectors
has grown significantly (as has the class of "good" ones, which
is not listed here); in contrast, "standard" file viruses and,
even more so, boot viruses seem to be less attractive targets of
product upgrading. With no product rated "perfect" (=100%), those
products with grade "excellent" (>95%) are listed below.

6.1 Detection of file viruses:
------------------------------
"Excellent" DOS scanners: DSS ( 99.9%)
                          AVP ( 99.7%)
                          FSE ( 99.7%)
                          SWP ( 98.4%)
                          NAV ( 98.1%)
                          AVS ( 97.9%)
                          FPR ( 95.5%)

6.2 Detection of macro viruses:
-------------------------------
"Perfect" DOS scanners  : AVP (100.0%)
                          DSS (100.0%)
"Excellent" DOS scanners: FPR ( 99.8%)
                          NAV ( 99.8%)
                          AVK ( 99.7%)
                          RAV ( 99.5%)
                          DRW ( 99.3%)
                          HMV ( 99.0%)
                          TBA ( 98.7%)
                          SCN ( 98.6%)
                          SWP ( 98.6%)
                          VET ( 97.5%)
                          AVS ( 97.2%)
                          INO ( 95.2%)

**************************************************************
Result #4) Specialised scanners (esp. those specialising in
        macro viruses) are not superior to the best overall
        scanners, even on large collections such as VTC's
        "zoo" testbeds.
**************************************************************

7. Summary #5: Detection of viruses in compressed objects under DOS:
====================================================================

Since test "1998-02", VTC tests include the detection of viruses
in compressed objects. For now, four popular compression methods
were selected: ARJ, LHA, ZIP and RAR. For every virus, one
infected object was compressed with each of these methods, and
detection was tested (a sketch of this setup follows below).
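For illustration, the following minimal sketch (Python standard
library only) shows how such a compressed testbed can be built for
the ZIP case; the directory names are illustrative, ARJ, LHA and
RAR would need their respective external archivers, and this is
not VTC's actual test tooling:

    import zipfile
    from pathlib import Path

    def build_zip_testbed(infected_dir: str, out_dir: str) -> int:
        """Wrap every infected sample in its own ZIP archive, one
        archive per object, mirroring the setup described above."""
        out = Path(out_dir)
        out.mkdir(parents=True, exist_ok=True)
        count = 0
        for sample in sorted(Path(infected_dir).iterdir()):
            if sample.is_file():
                with zipfile.ZipFile(out / (sample.name + ".zip"), "w",
                                     compression=zipfile.ZIP_DEFLATED) as zf:
                    zf.write(sample, arcname=sample.name)
                count += 1
        return count

    # A scanner is then run over out_dir; its detection rate on these
    # archives is compared with its rate on the uncompressed testbed.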
Concerning the results (see 6BDOSFIL.TXT and 6DDOSMAC.TXT), the
findings are presently rather DISAPPOINTING: only 9 scanners (out
of 20) detected at least ONE file virus compressed with at least
ONE compression method, and only 8 scanners (out of 21) detected
at least ONE macro virus compressed with at least ONE compression
method.

The following table lists ALL scanners which detect file and macro
viruses in objects compressed with AT LEAST TWO compression
methods:

               -- Compressed Files --    -- Compressed Macros --
               %ZIP  %LHA  %ARJ  %RAR    %ZIP  %LHA  %ARJ  %RAR
--------------------------------------------------------------------
ALL methods:
   AVP         93.2  93.2  93.2  93.2   100.0  99.9 100.0 100.0
   FSE         93.2  93.2  93.2  93.2   100.0  99.9 100.0 100.0
3 methods:
   DSS         99.5  99.5  99.5   0.0   100.0  99.9 100.0   0.0
   AVK         89.6  89.6  89.6  89.6    99.7  99.7  99.7  99.7
2 methods:
   FPR         95.0   0.0  95.0   0.0    99.0   0.0  99.0   0.0
   NVC         93.1   0.0  93.1   0.0    90.6   0.0  90.6   0.0
--------------------------------------------------------------------

Remark: Much more data were collected on the precision and
reliability of virus detection in compressed objects. But in the
present state, it seems NOT justified to add such differentiation
to the results discussed here.

*************************************************************
Result #5) VERY FEW products have reached an acceptable level
        of detecting viruses in infected objects compressed
        with the given compression methods. Significant
        investment of work is needed here.
*************************************************************

8. Summary #6: False Positive avoidance of DOS and Win-NT scanners:
===================================================================

Regarding the ability of scanners to avoid False Positive (FP)
alarms, the following AV products running under DOS reported NOT A
SINGLE False Positive alarm in both the file and macro zoo
testbeds and are therefore rated "perfect" (a small sketch of this
rating rule follows at the end of this section):

   FP-avoiding "perfect" DOS scanners: AVP, DSS, NVC, RAV and SCN

Several more DOS scanners gave NO FP alarm on clean files or on
clean macros:

   Perfect FP avoidance on the DOS clean file testbed:
      AVP, AVK, DSS, FPR, FSE, NVC, RAV, SCN and SWP
   Perfect FP avoidance on the DOS clean macro testbed:
      ANT, AVP, AVS, DSS, NVC, RAV, SCN and TBA

Comparing these results with the behaviour of the 32-bit scanner
engines, esp. using the results produced under Win-NT, there is
just ONE AV product which avoids ANY FP alarm on both clean file
and clean macro objects:

   FP-avoiding "perfect" Win-NT scanner: DSS

Several more scanners also gave NO FP alarm on clean files or
macros:

   Perfect FP avoidance on the Win-NT clean file testbed:
      AVK, AVP, DSS, FSE, NVC, PAV, RAV, SCN and SWP
   Perfect FP avoidance on the Win-NT clean macro testbed:
      AVS, DSS, FMA(FPR), FWN, IBM and TBA

Presently, only ONE AV product avoids ANY False Positive alarm on
both clean file and macro objects under both DOS and Win-NT:

   Overall perfect FP-avoiding scanner: DSS

*************************************************************
Result #6.1) VERY FEW products reliably avoid ANY False
        Positive alarm on clean file and macro objects, both
        under DOS and Win-NT. Several products show excellent
        FP avoidance in one category (either clean file or
        clean macro objects).
      #6.2) Only ONE product - DSS - did not give ANY FP alarm
        in both categories (file and macro objects) under BOTH
        DOS and Win-NT.
      #6.3) AV producers should intensify their work to avoid
        FP alarms.
**************************************************************
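The rating rule used above is strict: a single alarm on any clean
testbed forfeits the "perfect" rating for that combination. A
minimal bookkeeping sketch (Python; the testbed labels are
illustrative):

    def fp_rating(alarm_counts: dict) -> str:
        """alarm_counts maps a clean testbed to the number of false
        alarms a product raised on it; zero everywhere is 'perfect'."""
        if all(count == 0 for count in alarm_counts.values()):
            return "perfect"
        clean = [bed for bed, n in alarm_counts.items() if n == 0]
        return "no FP on: " + ", ".join(clean) if clean else "FP-prone"

    # DSS in this test: no alarm on any clean testbed, DOS or Win-NT.
    print(fp_rating({"DOS/file": 0, "DOS/macro": 0,
                     "NT/file": 0, "NT/macro": 0}))   # -> "perfect"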
9. Summary #7: Detection of File and Macro Malware (DOS/Win-NT):
================================================================

In comparison with the last VTC test (where no product was rated
"excellent" in this category under DOS), there are now several
scanners detecting over 90% of the malicious non-viral software in
VTC's (partial) malware testbeds. As this applies to both DOS and
Win-NT, the following table lists the scanners with the related
performance under BOTH operating systems:

                      ====== Malware Detection ======
                      = under DOS =   = under Win-NT =
                      (File/Macro-mw;  File/Macro-mw)
                      --------------------------------
"Excellent" scanners: DSS (98.4% 100.0%; 98.4% 100.0%)
                      AVP (94.5%  96.4%; 94.5%  96.4%)
                      FSE (94.5%  96.4%; 93.5% 100.0%)
                      AVK (93.1%  96.4%; 93.1%  96.4%)

Moreover, the following scanners reach 90% detection either for
file or for macro malware under at least one operating system
(DOS or Win-NT):

"Excellent" scanners in at least one category/under one OS:
   ACU, AVK, AVP, AVS, DRW, DSS, FSE, FPR(FMA), FWN, INO, PAV,
   RAV and TBA

**************************************************************
Result #7.1) Obviously, several AV producers have worked very
        hard to protect their customers also from non-viral
        malware. 4 scanners reached a considerable level
        (>90%) of file and macro malware detection under both
        DOS and Win-NT.
      #7.2) With the continuous growth of the malware
        testbeds, AV producers are well advised to improve
        their products also in this area.
**************************************************************

10. Summary #8: File/Macro virus detection under Windows-NT:
============================================================

Table ES4: Development of Win-NT scanners since 1997-07:
---------------------------------------------------------------
Scan-  === File Virus Detection ===  == Macro Virus Detection ==
ner     97/07  98/02  98/10  Delta    97/07  98/02  98/10  Delta
---------------------------------------------------------------
ANT      88.9   69.2   91.3  +22.1     92.2     -    85.7    -
ANY        -      -    69.7    -         -      -    70.5    -
AVK        -      -    99.6    -         -      -    99.6    -
AVP        -      -    83.7    -         -      -   100.0    -
AVS        -    97.4   96.6   -0.8       -    91.9   97.2   +5.3
AW         -    56.4     -     -         -    61.0     -     -
DSS      99.6   99.7   99.9   +0.2     99.0  100.0  100.0   +0.0
FPR/FMA    -    96.1     -     -         -    99.9   99.8   -0.1
FSE        -    85.3   99.8  +14.5       -    99.9     -     -
FWN        -      -      -     -         -      -    99.6    -
HMV        -      -      -     -         -      -    99.0    -
IBM      95.2   95.2   77.2  -18.0     92.9   92.6   98.6   +6.0
INO        -    92.8     -     -         -    89.7     -     -
IRS        -    96.3     -     -         -    99.1     -     -
IVB        -      -      -     -         -      -    92.8    -
NAV      86.5   97.1     -     -       95.6   98.7   99.9   +1.2
NVC      89.6   93.8   93.6   -0.2     96.6   99.2     -     -
PAV      97.7   98.7   98.4   -0.3     93.5   98.8   99.5   +0.7
RAV        -    81.6   84.9   +3.3       -    98.9   99.5   +0.6
PCC      63.1     -      -     -         -    94.8     -     -
PER        -      -      -     -         -    91.0     -     -
SCN      94.2   91.6   71.4  -20.2     97.6   99.1   97.7   -1.4
SWP      94.5   96.8   98.4   +1.6     89.1   98.4   97.5   -0.9
TBA        -    93.8   92.6   -1.2     96.1     -    98.7    -
TNT        -      -      -     -         -      -    44.4    -
VET      64.9     -      -     -         -    94.0     -     -
VSA        -    56.7     -     -         -    84.4     -     -
---------------------------------------------------------------
Mean     87.4   88.1   89.0            94.7   95.9   91.6
---------------------------------------------------------------

Generally, the ability of Win-NT scanners to detect file viruses
has "on average" slightly improved (+0.9%) in comparison with the
last test. At the same time, the "mean" macro virus detection
rate has decreased (in contrast to the DOS scanners!) by 4.3%.
Indeed, several scanners which performed well in VTC test
"1998-02" now yield lower macro virus detection rates. Btw: the
same tendency is also observable with Win-95 scanners.

"Excellent" Windows-NT scanners: DSS (99.9% 100.0%)
                                 AVK (99.6%  99.6%)
                                 PAV (98.4%  99.5%)
                                 SWP (98.4%  97.5%)
                                 AVS (96.6%  97.2%)

Moreover, besides the results mentioned above, FSE (99.8%)
reached "excellent" performance level in file virus detection.
The following scanners reached "excellent" performance level in
macro virus detection under Windows-NT (in addition to those
mentioned above):

   AVP (100.0%), NAV (99.9%), FPR/FMA (99.8%), FWN (99.6%),
   RAV (99.5%), HMV (99.0%), TBA (98.7%), IBM (98.6%) and
   SCN (97.7%)

*************************************************************
Result #8.1) With a growing number of scanners working under
        Windows-NT, some have reached the level of
        "excellency" (although not yet "perfect"), both in
        detecting file and macro viruses. Some more scanners
        also show excellent results in detecting either file
        or macro viruses.
      #8.2) Generally, the "average" detection rate of Win-NT
        scanners for file viruses improved slightly, but it
        dropped seriously for macro viruses. Concerning macro
        virus detection, Win-NT scanners behave "on average"
        significantly less favourably than DOS scanners.
**************************************************************

11. Summary #9: File/Macro virus detection under 32-bit engines:
================================================================

Concerning the 32-bit engines used under Windows 95/98 and
Windows NT, it is interesting to test the validity of the
hypothesis that related engines produce the same detection and
identification quality. For details, see 6HCOMP32.TXT.

When comparing the results from the related tests (as far as
applicable), it is interesting to observe that identical results
are presently achieved by only very few products: 6 (out of 19)
products detect zoo file viruses with equal quality, whereas a
few more (11 out of 19) detect ITW file viruses equally well
under all 32-bit engines. Concerning macro viruses, the results
are more favourable for ITW samples, as 18 (out of 24) products
detect ITW macro viruses equally well, whereas as few as 6 (out
of 24) products detect zoo macro viruses with equal quality.
Indeed, only 2 products were found to detect perfectly equally in
both the zoo and ITW file and macro virus testbeds: AVP and TBAV.
A few scanners achieve results which are not too far from this
perfect consistency. (A sketch of such a consistency check
follows below.)

*************************************************************
Result #9) The assumption that 32-bit engines in scanners
        produce the same detection rate under the different
        instantiations of 32-bit operating systems (esp.
        Windows-95, Windows-98 and Windows-NT) holds for only
        2 scanners (AVP and TBA) and is NOT correct in
        general, at least presently.
**************************************************************
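Result #9's consistency criterion can be phrased as a mechanical
check: a product's 32-bit engines count as "equal" only if every
platform reports the same rate on every testbed. A minimal sketch
(Python; the data layout and the sample figures are illustrative):

    def engines_consistent(rates: dict) -> bool:
        """rates maps a platform to {testbed: detection rate in %};
        True only if all platforms agree on every testbed."""
        per_platform = list(rates.values())
        return all(p == per_platform[0] for p in per_platform[1:])

    # Hypothetical product with identical 32-bit results everywhere:
    print(engines_consistent({
        "Win95": {"zoo-file": 99.7, "itw-macro": 100.0},
        "Win98": {"zoo-file": 99.7, "itw-macro": 100.0},
        "WinNT": {"zoo-file": 99.7, "itw-macro": 100.0},
    }))   # -> True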
12. Summary #10: Malware detection under Windows 95/NT:
=======================================================

As Windows 95 and Windows-NT are often used for downloading
potentially hazardous objects from the Internet, it is
interesting to measure the ability of AntiVirus products to also
act as AntiMalware products. The following table lists those
products which reached a "very good" (>90%) detection threshold
in both categories, file and macro malware, under both Windows 95
and Windows NT:

Table ES5: Malware detection under Windows-95/Windows-NT
========================================================
          Detection of File Malware   Detection of Macro Malware
Scanner       Win95      WinNT            Win95      WinNT
----------------------------------------------------------------
AVK           93.1%      93.1%            96.4%      96.6%
AVP           94.5%      94.5%            96.4%      96.4%
DSS           98.0%      98.4%           100.0%     100.0%
FSE           93.6%      93.5%           100.0%     100.0%
PAV           91.6%      91.6%            96.4%      96.4%
----------------------------------------------------------------

Moreover, the following additional products also reached a "very
good" level of macro malware detection under both Win-95 and
Win-NT:

   FPR/FMA (98.2% 98.2%), FWN (96.4% 96.4%), AVS (93.7% 93.7%)
   and TBA (91.9% 91.0%)

Evidently, some AV products are able to help protect users by
detecting file and macro-related malware at a relevant level.
Fortunately, the related products also show good to excellent
results in detecting viral malware.

**************************************************************
Result #10) Some AntiMalware producers help customers detect
        also non-viral malware under 32-bit operating systems,
        esp. Win-95 and Win-NT. But most AV products are far
        from supporting their customers against related
        threats.
**************************************************************

13. Conclusion: Searching the "Perfect AV/AM product":
======================================================

Within the scope of VTC's grading system, an "ideal AV product"
would have the following characteristics:

Definition: A "Perfect AV product"
----------------------------------
 1) will detect ALL viral samples (whether zoo or In-The-Wild)
    in ALL categories (file, boot and script-based), with always
    the same identification precision and in every infected
    sample,
 2) will detect ALL viral samples in every object, whether
    uncompressed or compressed with at least all popular
    compression methods, and
 3) will NEVER issue a False Positive alarm on any sample which
    is not viral.

Definition: A "Perfect AntiMalware product"
-------------------------------------------
 1) will be a "Perfect AV product", and
 2) will also reliably detect all essential forms of malicious
    software, whether in compressed or uncompressed form.

(A sketch encoding these criteria as a simple checklist follows
below.)

In VTC test "1998-10", we found NO "ideal AV product", but some
products come rather close to this ideal. More work must be
invested by AV producers to approach the "ideal product", esp.
for the detection of compressed viral objects and under 32-bit
engines. Moreover, some AV products are also capable of detecting
non-viral malware, but much more improvement is needed.
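Read as a checklist, the two definitions can be encoded directly.
A minimal sketch (Python; the record fields are illustrative and
simplify "reliably detect all essential forms" to a 100% rate):

    from dataclasses import dataclass

    @dataclass
    class ProductResult:        # illustrative fields, not VTC's schema
        zoo_rate: float         # % over all virus categories
        itw_rate: float         # % over the In-The-Wild testbed
        compressed_rate: float  # % over all popular compression methods
        false_positives: int    # alarms raised on clean testbeds
        malware_rate: float     # % over non-viral malware testbeds

    def is_perfect_av(r: ProductResult) -> bool:
        return (r.zoo_rate == 100.0 and r.itw_rate == 100.0
                and r.compressed_rate == 100.0
                and r.false_positives == 0)

    def is_perfect_antimalware(r: ProductResult) -> bool:
        return is_perfect_av(r) and r.malware_rate == 100.0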
14. Availability of full test results:
======================================

Much more information about this test, its methods and viral
databases, as well as detailed test results, is available for
anonymous FTP download from VTC's HomePage (VTC is part of
Working Group AGN):

   agn-www.informatik.uni-hamburg.de/vtc

Any comment and critical remark which helps VTC to improve its
test methods will be warmly welcomed. The next comparative test
is planned for December 1998 until February 1999, with the viral
databases frozen on October 30, 1998. Any AV producer wishing to
participate in the forthcoming test is invited to submit the
related products.

On behalf of the VTC Test Crew:
Dr. Klaus Brunnstein (November 28, 1998)

15. Disclaimer:
===============

Copyright, License, and Disclaimer.

This publication is (C) Copyright 1998 by Klaus Brunnstein and
the Virus Test Center at the University of Hamburg. Permission
(Copy-Left) is granted to everybody to distribute copies of this
information in electronic form, provided that this is done for
free, that the content of the information is not changed in any
way, and that the origin of this information is explicitly
mentioned. It is esp. permitted to store and distribute this set
of text files at university or other public mirror sites where
security/safety-related information is stored for unrestricted
public access for free.

Any other use, esp. including the distribution of these text
files on CD-ROMs or any publication as a whole or in parts, is
ONLY permitted after contact with the supervisor, Prof. Dr.
Klaus Brunnstein, or authorized members of the Virus Test Center
at Hamburg University, and this agreement must be made explicitly
in writing prior to the publication.

No responsibility is assumed by the publisher for any injury
and/or damage to persons or property as a matter of product
liability, negligence or otherwise, or from any use or operation
of any methods, products, instructions or ideas contained in the
material herein.

Prof. Dr. Klaus Brunnstein, Hamburg University, Germany