================================================
File 7EVALLIN.TXT:
------------------------------------------------
Evaluation of results for File, Macro and Script Virus/Malware
detection under LINUX in VTC Test "2002-12":
================================================
Formatted with non-proportional font (Courier)

Content of this file:
=====================
Eval A: Development of detection rates under Linux:
**********************************************************************
Eval LIN.01: Development of Linux Scanner Detection Rates
             Table LIN-A: File/Macro/Script Virus Detection Rates
                          in the last 3 VTC tests
Eval LIN.02: In-The-Wild Detection under Linux
Eval LIN.03: Evaluation of overall Linux AV detection rates
Eval LIN.04: Evaluation of detection by virus classes under Linux
             LIN.04.1 Grading the Detection of file viruses
             LIN.04.2 Grading the Detection of macro viruses
             LIN.04.3 Grading the Detection of script viruses
Eval LIN.05: Detection of Packed File and Macro Viruses under Linux
Eval LIN.06: Avoidance of False Alarms (File, Macro) under Linux
Eval LIN.07: Detection of File, Macro and Script Malware under Linux
Eval LIN.SUM: Grading of LINUX products
**********************************************************************

This part of the VTC "2002-12" test report evaluates the detailed
results as given in section (file):
     6iLINUX.TXT    File/Macro/Script Viruses/Malware results LINUX

The following *8* products participated in this scanner test for
Linux products:
--------------------------------------------------------
Products submitted for aVTC test under Linux (SuSE):
--------------------------------------------------------
AVK  v(def): 3.0 beta 1.1        Sig/date: December  7, 2001
CMD  v(def): 4.64.1              SIGN.DEF  date: December 17, 2001
                                 SIGN2.DEF date: December 17, 2001
                                 MACRO.DEF date: December 16, 2001
DRW  v(def): Dr.Web for Linux, version 4.26, date: September 22, 2001
                                 Sig/date: December 17, 2001
FPR  v(def): ---
FSE  v(def): Release 4.13 build 3360, Eng: F-PROT 3.10 build 701
                                 sign.def    date: December 13, 2001
                                 sign2.def   date: December 13, 2001
                                 fsmacro.def date: December 13, 2001
OAV  v(def): 0.2.0, date: December 12, 2001
                                 Sig: 2001.12.20.22.33
                                 Sig/date: December 20, 2001
RAV  v(def): 8.3.1, Eng: 8.5 for i386
                                 Sig/date: December 17, 2001
SCN  v(def): 4.16.0              Sig: 4177, date: December 17, 2001
--------------------------------------------------------
Remark: in one case (FPR), definition and signature dates could not
be derived from the installed product.


Eval LIN.01: Development of Linux Scanner Detection Rates:
==========================================================
Linux-based scanners were tested in "2002-12" for the 3rd time, this
time again (as in the 1st test) for file, macro and script malware
detection. This time, 8 scanners were available for testing, one
fewer than in the last test.
Table E-Lin.1: Performance of LINUX scanners in tests 2001-04 to 2002-12:
=========================================================================
Scan I === File Virus ==== + ===== Macro Virus ====== + === Script Virus ====
ner  I     Detection       I       Detection          I      Detection
-----+--------------------+--------------------------+------------------------
Test I 0104  0212  Delta  I 0104  0110  0212  Delta  I 0104  0110  0212  Delta
-----+--------------------+--------------------------+------------------------
ANT  I   -     -     -    I   -   97.1    -     -    I   -   81.8    -     -
AVK  I   -   99.9    -    I   -   100%  100~   0.0   I   -   100%  99.1  -0.9
AVP  I 99.9    -     -    I 100~  100%    -     -    I 99.8  100%    -     -
CMD  I 97.8  99.1  +1.3   I 100%  100%  100~  ~0.0   I 96.9  94.2  89.4  -4.8
DRW  I   -   98.3    -    I   -   99.5  99.4  -0.1   I   -   95.4  94.7  -0.7
FPR  I   -   98.9    -    I   -     -   100~    -    I   -     -   88.7    -
FSE  I 97.1  98.0  +0.9   I 100~  100%  100~  ~0.0   I 96.9  92.3  88.1  -4.2
MCV  I   -     -     -    I   -    9.1    -     -    I   -   27.6    -     -
OAV  I   -    9.1    -    I   -     -    0.1    -    I   -     -   13.9    -
RAV  I 93.5  96.7  +3.2   I 99.6  99.5  99.9  +0.4   I 84.9  82.5  96.1 +13.6
SCN  I 99.7  99.8  +0.1   I 100%  100%  100%   0.0   I 99.8  99.8  99.5  -0.3
-----+--------------------+--------------------------+------------------------
Mean I 97.6% 87.5% +1.4%  I 99.9% 89.5% 74.9% +0.1%  I 95.7% 86.0% 83.7% +0.5%
>10% I 97.6% 98.7%   -    I 99.9% 99.5% 99.8%   -    I 95.7% 86.0% 83.7%   -
-----+--------------------+--------------------------+------------------------
Remark: for abbreviations of products (code names), see appendix
        A5CodNam.txt. A dash indicates that the product did not take
        part in the respective test.

Evaluation: Zoo file virus detection has slightly improved if one
does not count the new product (OAV), which has an unacceptably low
detection rate. Zoo macro virus detection is stable at a very high
level (again, the new product performs unacceptably badly). Zoo
script virus detection is stable, but at an insufficient level.

********************************************************************
Findings LIN.01:
   Concerning zoo virus detection, LINUX products are less developed
   than those on other platforms. Detection rates for file viruses
   (in zoo: 98.7%) are significantly better than for macro viruses
   (in zoo: 89.3%) and for script viruses (in zoo: 83.7%).

   NO product detects ALL zoo file viruses, but 3 products detect
   >99% (grade: "excellent"):
        AVK (99.9%), SCN (99.8%), CMD (99.1%)

   ONE product detects ALL zoo macro viruses:
        SCN (100.0%); grade: "perfect",
   and 5 products detect almost 100% (grade: "excellent"):
        AVK, CMD, FPR, FSE (all: 100~), RAV (99.9%)

   NO product detects ALL zoo script viruses, but 2 products detect
   >99% (grade: "excellent"):
        SCN (99.5%), AVK (99.1%)
********************************************************************


Eval LIN.02: In-The-Wild (File,Macro,Script) Detection under LINUX
==================================================================
Concerning "In-The-Wild" viruses, the following grid is applied:
   - detection rate is 100% : scanner is "perfect"
   - detection rate is >99% : scanner is "excellent"
   - detection rate is >95% : scanner is "very good"
   - detection rate is >90% : scanner is "good"
   - detection rate is <90% : scanner is "risky"
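
Remark: the grading grids in this report are simple threshold
functions. The following sketch (Python; illustrative only, not part
of the test procedure) shows how the ITW grid above maps a detection
rate, given in percent, to a grade; function and variable names are
ours:

   # Maps an ITW detection rate (in percent) to the grade defined
   # by the grid above.
   def itw_grade(rate: float) -> str:
       if rate == 100.0:
           return "perfect"
       if rate > 99.0:
           return "excellent"
       if rate > 95.0:
           return "very good"
       if rate > 90.0:
           return "good"
       return "risky"

   # Example: RAV detected 100% of ITW file viruses but only 99.1%
   # of infected samples, so its sample rate grades as "excellent".
   assert itw_grade(100.0) == "perfect"
   assert itw_grade(99.1) == "excellent"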
100% detection of In-The-Wild viruses, and especially detection of
ALL instantiations (samples) of those viruses, is now an ABSOLUTE
REQUIREMENT for macro and script viruses (it must be observed that
detection and identification are not completely reliable).

Presently, 3 (out of 8) products are "perfect", as they detect ALL
viruses and ALL infected objects in ALL classes:

                               ITW virus/file detection
                         ( FileV.      MacroV.     ScriptV. )
   ----------------------------------------------------------
   "Perfect" LINUX ITW scanners:
        AVK   (100% 100%;  100% 100%;  100% 100%)
        DRW   (100% 100%;  100% 100%;  100% 100%)
        SCN   (100% 100%;  100% 100%;  100% 100%)
   "Excellent" LINUX ITW scanners:
        ----- NONE -----
   ----------------------------------------------------------

Concerning detection of ITW file viruses, 3 scanners are "perfect":
        AVK, DRW, SCN: all (100% 100%)
and 1 product (out of 8) detects ALL file viruses and almost all
infected samples and is therefore "excellent":
        RAV   (100% 99.1%)

Detection of ITW macro viruses is much better developed than
detection of ITW file viruses and infected objects:
   3 scanners are rated "perfect":
        AVK (100% 100%), DRW (100% 100%), SCN (100% 100%)
   4 scanners are rated "excellent":
        CMD, FPR, FSE: all (100% 99.9%), and RAV (100% 99.8%)

Detection of ITW script viruses is very good, as ALMOST ALL products
(7 out of 8) detected all viruses and infected objects:
   7 scanners are rated "perfect":
        AVK, CMD, DRW, FPR, FSE, RAV, SCN: all (100% 100%)

*******************************************************************
Findings LIN.02:
   3 AV products detect "perfectly" all ITW file, macro and script
   viruses in all files: AVK, DRW, SCN
   **************************************************
   Concerning detection of ITW file viruses:
        3 "perfect"   scanners: AVK, DRW, SCN
        1 "excellent" scanner:  RAV
   Concerning detection of ITW macro viruses:
        3 "perfect"   scanners: AVK, DRW, SCN
        4 "excellent" scanners: CMD, FPR, FSE, RAV
   Concerning detection of ITW script viruses:
        7 "perfect"   scanners: AVK, CMD, DRW, FPR, FSE, RAV, SCN
*******************************************************************


Eval LIN.03: Evaluation of overall LINUX AV detection rates (Zoo,ITW)
=====================================================================
The following grid is applied to classify scanners:
   - detection rate = 100%    : scanner is graded "perfect"
   - detection rate above 99% : scanner is graded "excellent"
   - detection rate above 95% : scanner is graded "very good"
   - detection rate above 90% : scanner is graded "good"
   - detection rate of 80-90% : scanner is graded "good enough"
   - detection rate of 70-80% : scanner is graded "not good enough"
   - detection rate of 60-70% : scanner is graded "rather bad"
   - detection rate of 50-60% : scanner is graded "very bad"
   - detection rate below 50% : scanner is graded "useless"

To assess an "overall AV grade" (including file, macro and script
virus detection, for unpacked objects), the LOWEST of the related
results is used to classify each scanner. Only scanners for which
all tests were completed are considered. (For problems: see
8problms.txt.)
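
Remark: the overall grade can be computed with a small sketch
(Python; illustrative only; the grid's boundary cases overlap, and
the sketch resolves them downward):

   # Grades the LOWEST of a scanner's file/macro/script zoo rates
   # (in percent) through the nine-level grid above.
   def zoo_grade(rate: float) -> str:
       if rate == 100.0: return "perfect"
       if rate > 99.0:   return "excellent"
       if rate > 95.0:   return "very good"
       if rate > 90.0:   return "good"
       if rate >= 80.0:  return "good enough"
       if rate >= 70.0:  return "not good enough"
       if rate >= 60.0:  return "rather bad"
       if rate >= 50.0:  return "very bad"
       return "useless"

   def overall_grade(file_rate: float, macro_rate: float,
                     script_rate: float) -> str:
       return zoo_grade(min(file_rate, macro_rate, script_rate))

   # Example: 100% / 100% / 99.1% is limited by the script rate
   # and therefore grades as "excellent".
   assert overall_grade(100.0, 100.0, 99.1) == "excellent"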
The following list indicates those scanners graded into one of the
upper three categories, with file, macro and script virus detection
rates for unpacked samples, and with perfect ITW virus detection
(rate = 100%):

                      Zoo test:              ITW test:
                (file/macro/script;     file/macro/script)
   -------------------------------------------------------
   "Excellent" LINUX scanners:
        AVK   (100%  100%  99.1%;   100%  100%  100%)
        SCN   (100%  100%  99.5%;   100%  100%  100%)
   -------------------------------------------------------

Concerning insufficient detection (<50%), we have to assign the
grade "useless" to OAV (=Open AntiVirus), which performs TOTALLY
UNACCEPTABLY:
   -------------------------------------------------------
   "Useless" LINUX scanner:
        OAV   ( 9.1%  0.1%  14.2%;   34.0%  0.0%  25.0%)
   -------------------------------------------------------

*******************************************************************
Findings LIN.03:
   2 LINUX products are overall rated "excellent": AVK, SCN
   -----------------------------------------------------
   1 LINUX scanner is overall rated "useless":     OAV
*******************************************************************


Eval LIN.04: Evaluation of detection by virus classes under LINUX:
==================================================================
Some scanners are specialised in detecting certain classes of
viruses (either by deliberately limiting themselves to one class,
esp. macro viruses, or by detecting one class significantly better
than the others). It is therefore worth noting which scanners
perform best at detecting file, macro and script viruses. Products
rated "perfect" (=100%), "excellent" (>99%) and "very good" (>95%)
are listed.

LIN.04.1 Grading the Detection of file zoo viruses under LINUX:
---------------------------------------------------------------
   "Perfect" LINUX file scanners:      === NONE ===
   "Excellent" LINUX file scanners:    AVK ( 99.9%)
                                       SCN ( 99.8%)
                                       CMD ( 99.1%)
   "Very Good" LINUX file scanners:    FPR ( 98.9%)
                                       DRW ( 98.3%)
                                       FSE ( 98.0%)
                                       RAV ( 96.7%)

LIN.04.2 Grading the Detection of macro zoo viruses under LINUX:
----------------------------------------------------------------
   "Perfect" LINUX macro scanners:     SCN (100.0%)
   "Excellent" LINUX macro scanners:   AVK ( 100~ )
                                       CMD ( 100~ )
                                       FPR ( 100~ )
                                       FSE ( 100~ )
                                       RAV ( 99.9%)
                                       DRW ( 99.4%)
   "Very Good" LINUX macro scanners:   === NONE ===

LIN.04.3 Grading the Detection of script zoo viruses under LINUX:
-----------------------------------------------------------------
   "Perfect" LINUX script scanners:    === NONE ===
   "Excellent" LINUX script scanners:  SCN ( 99.5%)
                                       AVK ( 99.1%)
   "Very Good" LINUX script scanners:  RAV ( 96.1%)

********************************************************************
Finding LIN.04: Performance of LINUX scanners by virus classes:
   ----------------------------------------------------
   Perfect   scanners for file zoo:    ---
   Excellent scanners for file zoo:    AVK, SCN, CMD
   Very Good scanners for file zoo:    FPR, DRW, FSE, RAV

   Perfect   scanners for macro zoo:   SCN
   Excellent scanners for macro zoo:   AVK, CMD, FPR, FSE, RAV, DRW
   Very Good scanners for macro zoo:   ---

   Perfect   scanners for script zoo:  ---
   Excellent scanners for script zoo:  SCN, AVK
   Very Good scanners for script zoo:  RAV
********************************************************************


Eval LIN.05: Detection of Packed File and Macro Viruses under LINUX:
====================================================================
Detection of file and macro viruses within packed objects is
becoming essential for on-access scanning, esp. for incoming email
possibly loaded with malicious objects. It therefore seems
reasonable to test whether at least ITW viral objects compressed
with popular packing methods are also detected. The following 6
packers were used in these tests:
     PKZIP, ARJ, LHA, RAR, WinRAR and CAB.

ATTENTION: for packing objects in ITW testbeds, we used WinRAR 2.0.
As WinRAR 2.0 did not properly pack VTC's very large file virus
testbed, this testbed was packed with WinRAR 2.9, whose final
version had at that time been available for more than 3 months
(after a longer period of beta versions). Only upon evaluation did
we detect that ONLY ONE product (RAV) was able to handle WinRAR 2.9
packed malware at all, at least to some degree (though not
sufficiently for the grade "perfect"). Consequently, this evaluation
does NOT include WinRAR; it covers ARJ, CAB, LHA, RAR and ZIP.

A "perfect" product detects ALL packed ITW file/macro viruses
(100%) for all 5 packers:
   -------------------------------------------------------
   "Perfect" packed file/macro virus detectors:   AVK, SCN
   -------------------------------------------------------
An "excellent" product detects ALL packed ITW file/macro viruses
(100%) for 4 packers:
   -------------------------------------------------------
   "Excellent" packed file/macro virus detectors: DRW, RAV
   -------------------------------------------------------
A "very good" product detects ALL packed ITW file/macro viruses
(100%) for 3 packers:
   -------------------------------------------------------
   "Very Good" packed file/macro virus detectors: ---
   -------------------------------------------------------

Some products which failed to detect all packed ITW file viruses
were able to detect packed macro viruses at a significant level:
   "Perfect" packed macro virus detectors (5 packers):   AVK, CMD,
                                                         FPR, SCN
   "Excellent" packed macro virus detectors (4 packers): DRW, RAV
   "Very Good" packed macro virus detector (3 packers):  FSE

Remark: much more data were collected on the precision and
reliability of virus detection in packed objects. But in the present
state, it does NOT seem justified to add further differentiation to
the results discussed here.

***********************************************************************
Findings LIN.05:
   Detection of packed viral objects needs improvement:
   Perfect   packed ITW file/macro virus LINUX detectors: AVK, SCN
   Excellent packed ITW file/macro virus LINUX detectors: DRW, RAV
   Very Good packed ITW file/macro virus LINUX detectors: ---
***********************************************************************
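
Remark: the packer-coverage grading above can be stated as a short
sketch (Python; illustrative only; the per-packer rates shown in the
example are hypothetical, not measured values):

   # A product's packed-virus grade depends on how many of the 5
   # packers it handles with 100% detection of packed ITW viruses.
   PACKERS = ("ARJ", "CAB", "LHA", "RAR", "ZIP")

   def packer_grade(rate_by_packer: dict) -> str:
       # Count packers for which ALL packed ITW objects were found.
       full = sum(1 for p in PACKERS
                  if rate_by_packer.get(p, 0.0) == 100.0)
       if full == 5: return "perfect"
       if full == 4: return "excellent"
       if full == 3: return "very good"
       return "below very good"

   # Example: full detection for 4 packers, a miss under RAR.
   example = {"ARJ": 100.0, "CAB": 100.0, "LHA": 100.0,
              "RAR": 98.0, "ZIP": 100.0}
   assert packer_grade(example) == "excellent"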
Eval LIN.06: Avoidance of False Alarms (File/Macro) under LINUX:
================================================================
First introduced in VTC test "1998-10", a set of clean (and
non-malicious) objects has been added to the macro virus testbeds to
determine the ability of scanners to avoid False-Positive (FP)
alarms. This ability is essential for "excellent" and "very good"
scanners, as there is no automatic aid for customers in handling
such cases (besides the psychological impact on the customer's
work). Therefore, the grid used for grading AV products must be
significantly more rigid than the one used for detection.

The following grid is applied to classify scanners:
   - False Positive rate =  0.0% : scanner is graded "perfect"
   - False Positive rate <  0.5% : scanner is graded "excellent"
   - False Positive rate <  2.5% : scanner is graded "very good"
   - False Positive rate <  5.0% : scanner is graded "good enough"
   - False Positive rate < 10.0% : scanner is graded "rather bad"
   - False Positive rate < 20.0% : scanner is graded "very bad"
   - False Positive rate > 20.0% : scanner is graded "useless"

Regarding the ability of scanners to avoid FP alarms, only 1 (of 8)
product reported NO SINGLE False-Positive alarm in the file and
macro zoo testbeds and is therefore rated "perfect":
   --------------------------------------------------------------
   "Perfect" file AND macro FP-avoiding LINUX scanner:  SCN
   --------------------------------------------------------------

Products performed much better at avoiding false-positive alarms on
clean file objects:
   --------------------------------------------------------------
   "Perfect" file-FP avoiding LINUX scanners: AVK, FPR, FSE, RAV,
                                              SCN
   --------------------------------------------------------------
(In addition, OAV did not report any false alarm, but given its
unacceptably low detection rate, this cannot be considered for
grading.)

********************************************************************
Findings LIN.06:
   Avoidance of False-Positive alarms is insufficient and needs
   improvement.
   FP-avoiding "perfect" LINUX scanner: SCN
   ***************************************************
   Concerning file-FP avoidance, these products are
        "perfect":   AVK, FPR, FSE, RAV, SCN
        "excellent": DRW, CMD
   ***************************************************
   Concerning macro-FP avoidance, this product is
        "perfect":   SCN
        "excellent": RAV
********************************************************************


Eval LIN.07: Detection of File, Macro and Script Malware under LINUX
====================================================================
Since test "1997-07", VTC has also tested the ability of AV products
to detect non-viral malware. An essential argument for this category
is that customers are interested in also being warned about and
protected from non-viral and non-wormy malicious objects such as
trojans etc., the payload of which may be disastrous to their work
(e.g. stealing passwords). Since VTC test "1999-03", malware
detection has been a mandatory part of VTC tests, both for submitted
products and for those downloaded as free evaluation copies. In
previous tests, malware detection was tested for file and macro
malware; since test "2001-10", script malware detection has also
been tested. As VTC tests were the first to include malware
detection tests, we are very glad to observe that a growing number
of scanners is indeed able to detect non-viral malware.

The following grid (admittedly with reduced granularity) is applied
to classify detection of file, macro and script malware:
   - detection rate = 100%    : scanner is "perfect"
   - detection rate >  90%    : scanner is "excellent"
   - detection rate of 80-90% : scanner is "very good"
   - detection rate of 60-80% : scanner is "good enough"
   - detection rate of < 60%  : scanner is "not good enough"

In comparison to the last test (2001-10), where we reported for the
first time that 2 scanners (AVP and SCN) detected ALL specimens of
file, macro and script malware, product behaviour has slightly
deteriorated.
Generally, detection of malware needs significant further
development, as mean detection rates over all platforms show:
   mean detection rate for file   malware: 78.7% (89.3% for scanners >10%)
                       for macro  malware: 86.0% (98.3% for scanners >10%)
                       for script malware: 52.8% (60.3% for scanners >10%)

Concerning file, macro and script malware detection under LINUX:
                                              (file/macro/script)
   ---------------------------------------------------------------------
   "Perfect" file, macro & script malware detectors:   ==== NONE ====
   "Excellent" file, macro & script malware detectors:
        AVK   (97.7%  100.0%  95.7%)
        SCN   (92.1%   99.8%  97.7%)
   "Very Good" file, macro & script malware detectors:
        RAV   (86.1%   99.3%  82.1%)
   ---------------------------------------------------------------------

Concerning file malware detection only, NO product is rated
"perfect", but 4 products are "excellent" and 2 products are rated
"very good":
   -------------------------------------------------
   "Perfect" file malware detectors:    === NONE ===
   "Excellent" file malware detectors:  AVK ( 97.7%)
                                        CMD ( 92.5%)
                                        SCN ( 92.1%)
                                        FPR ( 91.4%)
   "Very Good" file malware detectors:  FSE ( 89.9%)
                                        RAV ( 86.1%)
   -------------------------------------------------

Concerning macro malware detection only, 1 product is rated
"perfect", and 6 more reach grade "excellent":
   -------------------------------------------------
   "Perfect" macro malware detectors:   AVK (100.0%)
   "Excellent" macro malware detectors: SCN ( 99.8%)
                                        CMD ( 99.3%)
                                        FPR ( 99.3%)
                                        RAV ( 99.3%)
                                        FSE ( 99.1%)
                                        DRW ( 91.1%)
   -------------------------------------------------

Concerning script malware detection only, NO product is rated
"perfect", but 2 are rated "excellent" and 1 "very good":
   -------------------------------------------------
   "Perfect" script malware detectors:   === NONE ===
   "Excellent" script malware detectors: SCN ( 97.7%)
                                         AVK ( 95.7%)
   "Very Good" script malware detectors: RAV ( 82.1%)
   -------------------------------------------------

********************************************************************
Findings LIN.07:
   NO LINUX product can be rated "perfect" in detecting ALL file,
   macro & script malware specimens:
   ***************************************************
   2 products are rated "excellent": AVK, SCN
   1 product is rated "very good":   RAV
   ***************************************************
   Concerning single classes of malware:
   A) "perfect" file malware detectors:    ---
      "excellent" file malware detectors:  AVK, CMD, SCN, FPR
      "very good" file malware detectors:  FSE, RAV
   B) "perfect" macro malware detector:    AVK
      "excellent" macro malware detectors: SCN, CMD, DRW, FPR,
                                           FSE, RAV
      "very good" macro malware detectors: ---
   C) "perfect" script malware detectors:  ---
      "excellent" script malware detectors: SCN, AVK
      "very good" script malware detector:  RAV
*******************************************************************
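
Remark: the bracketed means above exclude scanners whose detection
rate is below 10%, so that a barely functional product (here: OAV)
does not distort the average. A minimal sketch (Python; the rates
in the example are hypothetical, not measured values):

   # Plain mean over all products vs. mean over "serious" scanners
   # (those detecting more than 10%); rates are percentages.
   def means(rates):
       plain = sum(rates) / len(rates)
       serious = [r for r in rates if r > 10.0]
       return plain, sum(serious) / len(serious)

   # Example: one outlier near zero drags the plain mean far down.
   plain, filtered = means([97.7, 92.5, 92.1, 91.4, 89.9, 86.1, 9.0])
   print(round(plain, 1), round(filtered, 1))   # 79.8 vs. 91.6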
Eval LIN.SUM: Grading of LINUX products:
========================================
Under the scope of VTC's grading system, a "Perfect LINUX AV/AM
product" would have the following characteristics:

Definition (1): A "Perfect AntiVirus (AV) product"
--------------------------------------------------
   1) Will detect ALL viral samples "In-The-Wild" AND at least 99%
      of zoo samples, in ALL categories (file, macro and
      script-based viruses), with consistently high precision of
      identification and in every infected sample,
   2) Will detect ALL ITW viral samples in compressed objects for
      all (5) popular packers, and
   3) Will NEVER issue a False-Positive alarm on any sample which
      is not viral.

Definition (2): A "Perfect AntiMalware (AM) product"
----------------------------------------------------
   1) Will be a "Perfect AntiVirus product", that is:
      100% ITW detection AND >99% zoo detection
      AND high precision of identification
      AND high precision of detection
      AND 100% detection of ITW viruses in compressed objects
      AND 0% False-Positive rate,
   2) AND will also detect essential forms of malicious software,
      at least in unpacked form, reliably at high rates (>90%).

*******************************************************************
In VTC test "2002-12", we found
     *** NO perfect LINUX AV product ***
and we found
     *** NO perfect LINUX AM product ***
*******************************************************************

But several products approach these definitions on a rather high
level ("perfect" being defined at the 100% level, and "excellent"
at 99% for virus detection and 90% for malware detection):

Test category:              "Perfect"              "Excellent"
------------------------------------------------------------------
LINUX file ITW test:        AVK,DRW,SCN            RAV
LINUX macro ITW test:       AVK,DRW,SCN            CMD,FPR,FSE,RAV
LINUX script ITW test:      AVK,CMD,DRW,           ---
                            FPR,FSE,RAV,SCN
------------------------------------------------------------------
LINUX file zoo test:        ---                    AVK,SCN,CMD
LINUX macro zoo test:       SCN                    AVK,CMD,FPR,FSE,
                                                   RAV,DRW
LINUX script zoo test:      ---                    SCN,AVK
------------------------------------------------------------------
LINUX file pack test:       AVK,SCN                DRW,RAV
LINUX macro pack test:      AVK,SCN,CMD,FPR        DRW,RAV
LINUX file FP avoidance:    AVK,FPR,FSE,RAV,SCN    CMD,DRW
LINUX macro FP avoidance:   SCN                    RAV
------------------------------------------------------------------
LINUX file malware test:    ---                    AVK,CMD,SCN,FPR
LINUX macro malware test:   AVK                    SCN,CMD,FPR,RAV,
                                                   DRW,FSE
LINUX script malware test:  ---                    SCN,AVK
------------------------------------------------------------------

In order to support the race for more customer protection, we rank
the products in this LINUX test with a simple algorithm: every
"perfect" entry in a test category counts 2 points, and every
"excellent" entry counts 1 point (see the sketch at the end of this
file). For the first places, this yields:

************************************************************
"Perfect" LINUX AntiVirus product:   === NONE === (20 points)
"Excellent" LINUX AV products:
     1st place: SCN       (18 points)
     2nd place: AVK       (15 points)
     3rd place: DRW, RAV  (10 points)
     5th place: CMD, FPR  ( 8 points)
     7th place: FSE       ( 6 points)
************************************************************
"Perfect" LINUX AntiMalware product: === NONE === (26 points)
"Excellent" LINUX AM products:
     1st place: SCN       (21 points)
     2nd place: AVK       (19 points)
     3rd place: DRW, RAV  (11 points)
     5th place: CMD, FPR  (10 points)
     7th place: FSE       ( 7 points)
************************************************************

*************************************************************
In addition, we regret that we must grade OAV into the
"useless" class!
=============================================================
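
Remark: the place-counting above can be reproduced with a short
sketch (Python; illustrative only; the grades are transcribed from
the category table above, and all names are ours):

   # Every "perfect" grade in a test category scores 2 points,
   # every "excellent" grade scores 1 point. The AV score sums the
   # 10 virus-related categories; the AM score adds the 3 malware
   # categories.
   POINTS = {"perfect": 2, "excellent": 1}

   def score(grades):
       return sum(POINTS.get(g, 0) for g in grades)

   # SCN's grades in the 10 AV categories:
   scn_av = ["perfect", "perfect", "perfect",      # file/macro/script ITW
             "excellent", "perfect", "excellent",  # file/macro/script zoo
             "perfect", "perfect",                 # file/macro pack
             "perfect", "perfect"]                 # file/macro FP avoidance
   scn_am = scn_av + ["excellent"] * 3             # file/macro/script malware

   print(score(scn_av), score(scn_am))             # 18 and 21, as above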