================================================
File 7lEVALIN.TXT:
------------------------------------------------
Evaluation of results for File, Macro and Script Virus/Malware
detection under LINUX in VTC Test "2004-07":
================================================
Formatted with non-proportional font (Courier)

Content of this file:
=====================
Eval A: Development of detection rates under Linux:
*************************************************************************
Eval LIN.01: Development of Linux Scanner Detection Rates
     Table LIN.A1: File Virus Detection Rate in last 4 VTC tests
           LIN.A2: Macro/Script Virus Detection Rate in last 4 VTC tests
Eval LIN.02: In-The-Wild Detection under Linux
Eval LIN.03: Evaluation of overall Linux AV detection rates
Eval LIN.04: Evaluation of detection by virus classes under Linux
     LIN.04.1 Grading the Detection of file viruses
     LIN.04.2 Grading the Detection of macro viruses
     LIN.04.3 Grading the Detection of script viruses
Eval LIN.05: Detection of Packed File and Macro Viruses under Linux
           + AND Loss of Virus Detection through Packing
Eval LIN.06: Avoidance of False Alarms (File, Macro) under Linux
Eval LIN.07: Detection of File, Macro and Script Malware under Linux
Eval LIN.SUM: Grading of LINUX products
*************************************************************************

This part of the VTC "2004-07" test report evaluates the detailed
results as given in sections (files):
     6iLINUX.TXT    File/Macro/Script Viruses/Malware results LINUX

The following *11* products participated in this scanner test
for Linux products:
===============================================================
Products submitted for aVTC test under Linux (SuSE 7.0):
===============================================================
ANT = AntiVir v:2.0.6-22                    H+BEDV Datentechnik      Germany
AVP = Kaspersky Anti-Virus (KAV) v:4.0.3.0  Kaspersky Lab            Russia
CLA = Clam AntiVirus v:0.54
CMD = Command Antivirus v:4.75.0            Command Software Systems USA
DRW = Dr. Web v:4.29.7                      DialogueScience          Russia
FPR = F-PROT v:3.13                         Frisk Software           Iceland
FSE = F-SECURE v:4.50 build 2092            F-Secure Corporation     Finland
INO = eTrust AV v:23.59.00                  Computer Associates      USA
OAV = Open AntiVirus                        Open Antivirus Project
SCN = McAfee VirusScan v:4.24.0             Network Associates       USA
SWP = Sophos AV v:3.67                      Sophos                   UK
===============================================================

Eval LIN.01: Development of Linux Scanner Detection Rates:
==========================================================

Linux-based scanners were tested in "2004-07" for the 3rd time, this
time again (as in the 1st test) for file, macro and script virus/malware
detection. This time, 11 scanners were available for testing, 2 more
than in the last test.
Table Lin.A: Performance of LINUX scanners in Tests 2001-04 until 2004-07:
==========================================================================

Table Lin.A1: Detection performance for file viruses:
=====================================================
Scan I ==== File Virus ====
ner  I      Detection
-----+-----------------------
Test I 0104  0212  0407 Delta
-----+-----------------------
ANT  I   -     -   90.9    -
AVK  I   -   99.9    -     -
AVP  I 99.9    -   100~    -
CLA  I   -     -   34.8    -
CMD  I 97.8  99.1  99.1   0.0
DRW  I   -   98.3  79.0 -19.3
FPR  I   -   98.9  99.6  +0.7
FSE  I 97.1  98.0  100~  +2.0
INO  I   -     -   95.4    -
MCV  I   -     -     -     -
OAV  I   -    9.1  34.1 +25.0
RAV  I 93.5  96.7    -     -
SCN  I 99.7  99.8  100~  +0.2
SWP  I   -     -   98.2    -
-----+-----------------------
Mean : 97.6  87.5  84.7  -2.8
>10% : 97.6  98.7  84.7 -14.0
-----+-----------------------

Table Lin.A2: Detection performance for macro and script viruses:
==================================================================
Scan I ===== Macro Virus ====== + ===== Script Virus ======
ner  I       Detection          I       Detection
-----+--------------------------+--------------------------
Test I 0104 0110 0212 0407 DeltaI 0104 0110 0212 0407 Delta
-----+--------------------------+--------------------------
ANT  I   -  97.1   -  97.9   -  I   -  81.8   -  87.6   -
AVK  I   -  100% 100~   -    -  I   -  100% 99.1   -    -
AVP  I 100~ 100%   -  100~   -  I 99.8 100%   -  99.7   -
CLA  I   -    -    -   0.5   -  I   -    -    -  27.1   -
CMD  I 100% 100% 100~ 99.9 -0.1 I 96.9 94.2 89.4 99.4 +10.0
DRW  I   -  99.5 99.4 99.4  0.0 I   -  95.4 94.7 95.4  +0.7
FPR  I   -    -  100~ 99.9 -0.1 I   -    -  88.7 99.4 +10.7
FSE  I 100~ 100% 100~ 100% +0.0 I 96.9 92.3 88.1 99.9 +11.8
INO  I   -    -    -  99.3   -  I   -    -    -  96.6   -
MCV  I   -   9.1   -    -    -  I   -  27.6   -    -    -
OAV  I   -    -   0.1  0.1  0.0 I   -    -  13.9 27.1 +13.2
RAV  I 99.6 99.5 99.9   -    -  I 84.9 82.5 96.1   -    -
SCN  I 100% 100% 100% 100~ -0.0 I 99.8 99.8 99.5 99.8  +0.3
SWP  I   -    -    -  99.8   -  I   -    -    -  97.2   -
-----+--------------------------+--------------------------
Mean : 99.9 89.5 74.9 81.6 +6.7 I 95.7 86.0 83.7 84.5  +0.8
>10% : 99.9 99.5 99.8 99.6 -0.2 I 95.7 86.0 83.7 84.5  +0.8
-----+--------------------------+--------------------------

Remark: for abbreviations of products (code names), see appendix
        A5CodNam.txt.

Evaluation: Zoo file virus detection is stable for most scanners,
              but the mean value (84.7%) is pulled down by
              3 unacceptable results.
            Zoo macro virus detection is stable on a very high level
              (99.6%) when a new product with an unacceptably bad
              rate is not counted.
            Zoo script virus detection has significantly improved
              (84.5%) but remains at an insufficient level.

********************************************************************
Findings LIN.01: Concerning zoo virus detection, LINUX products are
                 less developed than those for other platforms.
                 Detection rates for file viruses are
                   significantly reduced (in zoo: 84.7%),
                 for macro viruses still high (in zoo: 99.6%),
                 for script viruses improved (in zoo: 84.5%).
                 ---------------------------------------------------
                 NO product detects ALL zoo file viruses,
                 3 products detect almost 100% (grade: excellent):
                     AVP,FSE,SCN (100~)
                 2 products detect >99% (grade: excellent):
                     FPR (99.6%), CMD (99.1%)
                 2 more products detect >95% (grade: very good):
                     SWP (98.2%), INO (95.4%)
                 ---------------------------------------------------
                 1 product detects ALL zoo macro viruses (perfect):
                     FSE (100.0%)
                 2 products detect almost all zoo macro viruses
                   (100~, grade: excellent):
                     AVP,SCN (100~)
                 5 products detect >99% (grade: excellent):
                     FPR,CMD (99.9%), SWP (99.8%), DRW (99.4%),
                     INO (99.3%)
                 1 product detects >95% (grade: very good):
                     ANT (97.9%)
                 ---------------------------------------------------
                 NO product detects ALL zoo script viruses,
                 5 products detect >99% (grade: excellent):
                     FSE (99.9%), SCN (99.8%), AVP (99.7%),
                     CMD,FPR (99.4%)
                 3 products detect >95% (grade: very good):
                     SWP (97.2%), INO (96.6%), DRW (95.4%)
                 ---------------------------------------------------
                 For the 2nd time, the following qualify as
                 "worst products in test":
                  CLA: file virus detection rate:   34.8%
                       macro virus detection rate:   0.5%
                       script virus detection rate: 27.1%
                  OAV: file virus detection rate:   34.1%
                       macro virus detection rate:   0.1%
                       script virus detection rate: 27.1%
********************************************************************

Eval LIN.02: In-The-Wild (File,Macro,Script) Detection under LINUX
==================================================================

Concerning "In-The-Wild" viruses, the following grid is applied:
   - detection rate is  100% : scanner is "perfect"
   - detection rate is > 99% : scanner is "excellent"
   - detection rate is > 95% : scanner is "very good"
   - detection rate is > 90% : scanner is "good"
   - detection rate is < 90% : scanner is "risky"

100% detection of In-the-Wild viruses, and especially detection of
ALL instantiations of those viruses, is now an ABSOLUTE REQUIREMENT
for macro and script viruses (it must be observed that detection and
identification are not completely reliable).

Different from the last test, where 3 products (AVK, DRW and SCN)
detected ALL samples of ALL ITW file, macro and script viruses,
detection is now significantly reduced, as only ONE product - SCN -
is "perfect". 2 products are "excellent", as they detect ALL viruses
except one (and the related infected objects) in ALL classes.

                                ITW virus/file detection
                                (  FileV.      MacroV.    ScriptV.  )
                                -------------------------------------
"perfect" LINUX ITW scanner:
     SCN                        (100%  100%;  100% 100%; 100% 100% )
"excellent" LINUX ITW scanners:
     AVP                        (99.1% 99.8%; 100% 100%; 100% 100% )
     FSE                        (99.1% 99.8%; 100% 100%; 100% 99.4%)
                                -------------------------------------

Concerning detection of ITW file viruses, 1 scanner is "perfect":
     SCN (100% 100%)
And 7 products detect all ITW viruses EXCEPT 1 and are regarded
as "excellent":
     CMD (100% 99.2%), AVP,DRW,FPR,FSE (99.1% 99.8%),
     INO,SWP (99.1% 99.5%)

Concerning detection of ITW macro viruses, 5 products detect ALL
viruses and 3 products detect ALL viruses except one sample:
   5 scanners are rated "perfect":   ANT,AVP,FSE,SCN,SWP (100% 100%)
   3 scanners are rated "excellent": CMD,FPR,INO (100% 99.9%)

Concerning detection of ITW script viruses, 2 products detect ALL
viruses and 3 products detect ALL viruses except a few samples and
are graded "excellent".
   2 scanners are rated "perfect":   AVP,SCN (100% 100%)
   3 scanners are rated "excellent": CMD,FPR,FSE (100% 99.4%)

********************************************************************
Findings LIN.02: ONE AV product detects "perfectly" all ITW file,
                 macro and script viruses in all files (last test: 3):
                     SCN
                 ***************************************************
                 Concerning detection of ITW file viruses:
                  1 "perfect" scanner:    SCN
                  7 "excellent" scanners: CMD,AVP,DRW,FPR,FSE,INO,SWP
                 Concerning detection of ITW macro viruses:
                  5 "perfect" scanners:   ANT,AVP,FSE,SCN,SWP
                  3 "excellent" scanners: CMD,FPR,INO
                 Concerning detection of ITW script viruses:
                  2 "perfect" scanners:   AVP,SCN
                  3 "excellent" scanners: CMD,FPR,FSE
********************************************************************

Eval LIN.03: Evaluation of overall LINUX AV detection rates (Zoo,ITW)
=====================================================================

The following grid is applied to classify scanners:
   - detection rate  = 100%   : scanner is graded "perfect"
   - detection rate above 99% : scanner is graded "excellent"
   - detection rate above 95% : scanner is graded "very good"
   - detection rate above 90% : scanner is graded "good"
   - detection rate of 80-90% : scanner is graded "good enough"
   - detection rate of 70-80% : scanner is graded "not good enough"
   - detection rate of 60-70% : scanner is graded "rather bad"
   - detection rate of 50-60% : scanner is graded "very bad"
   - detection rate below 50% : scanner is graded "useless"

To assess an "overall AV grade" (including file, macro and script
virus detection, for unpacked objects), the lowest of the related
results is used to classify each scanner. Only scanners for which
all tests were completed are considered. (For problems: see
8problms.txt.)

The following list indicates those scanners graded into one of the
upper three categories, with file, macro and script virus detection
rates for unpacked samples. As discussed before, only one product
- SCN - perfectly detects ITW viruses, whereas a few others miss
one virus.

                             Zoo test:           ITW test:
                             (file/macro/script; file/macro/script)
                             --------------------------------------
"Excellent" LINUX scanners:
     SCN                     (100~  100~ 99.8%;  100%  100% 100%)
     AVP                     (100~  100~ 99.7%;  99.1% 100% 100%)
     FSE                     (100~  100% 99.9%;  99.1% 100% 100%)
                             --------------------------------------

Concerning insufficient detection (<50%), we have to assign the
grade "useless" to OAV (=Open AntiVirus) and CLA (=Clam AntiVirus),
which both perform TOTALLY UNACCEPTABLY:
                             --------------------------------------
"Useless" LINUX scanners:
     CLA                     (34.8% 0.5% 27.1%;  45.8% 4.1% 45.5%)
     OAV                     (34.1% 0.1% 27.1%;  35.5% 0.0% 45.5%)
                             --------------------------------------

*********************************************************************
Findings LIN.03: 3 LINUX products overall rated "excellent":
                     SCN,AVP,FSE
                 ---------------------------------------------------
                 2 "Useless" overall LINUX scanners: CLA,OAV
*********************************************************************

Eval LIN.04: Evaluation of detection by virus classes under LINUX:
==================================================================

Some scanners are specialised in detecting some class of viruses
(either by deliberately limiting themselves to one class, esp. macro
viruses, or by detecting one class significantly better than others).
It is therefore worth noting which scanners perform best in detecting
macro and script viruses. Products rated "perfect" (=100%),
"excellent" (>99%) and "very good" (>95%) are listed; a short code
sketch of this threshold grading follows below.
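
Remark: the grading grids used in Eval LIN.03 and LIN.04 are plain
threshold mappings. The following short Python sketch (not part of
the VTC tooling; function and example names are ours, and the
handling of exact band edges is our assumption, as the grid does not
specify it) illustrates how a zoo detection rate maps to a grade:

   def grade(rate):
       # rate: detection rate in percent, e.g. 99.6 for FPR (file zoo)
       if rate == 100.0: return "perfect"
       if rate >  99.0:  return "excellent"
       if rate >  95.0:  return "very good"
       if rate >  90.0:  return "good"
       if rate >= 80.0:  return "good enough"
       if rate >= 70.0:  return "not good enough"
       if rate >= 60.0:  return "rather bad"
       if rate >= 50.0:  return "very bad"
       return "useless"

   # Examples from this report:
   #   grade(99.6) -> "excellent"  (FPR, file zoo)
   #   grade(34.8) -> "useless"    (CLA, file zoo)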
LIN.04.1 Grading the Detection of file zoo viruses under LINUX:
---------------------------------------------------------------
 "Perfect" LINUX file scanners:     === NONE ===
 "Excellent" LINUX file scanners:   AVP ( 100~ )
                                    FSE ( 100~ )
                                    SCN ( 100~ )
                                    FPR ( 99.6%)
                                    CMD ( 99.1%)
 "Very Good" LINUX file scanners:   SWP ( 98.2%)
                                    INO ( 95.4%)

LIN.04.2 Grading the Detection of macro zoo viruses under LINUX:
----------------------------------------------------------------
 "Perfect" LINUX macro scanners:    FSE ( 100% )
 "Excellent" LINUX macro scanners:  AVP ( 100~ )
                                    SCN ( 100~ )
                                    CMD ( 99.9%)
                                    FPR ( 99.9%)
                                    SWP ( 99.8%)
                                    DRW ( 99.4%)
                                    INO ( 99.3%)
 "Very Good" LINUX macro scanners:  ANT ( 97.9%)

LIN.04.3 Grading the Detection of script zoo viruses under LINUX:
-----------------------------------------------------------------
 "Perfect" LINUX script scanners:   === NONE ===
 "Excellent" LINUX script scanners: FSE ( 99.9%)
                                    SCN ( 99.8%)
                                    AVP ( 99.7%)
                                    CMD ( 99.4%)
                                    FPR ( 99.4%)
 "Very Good" LINUX script scanners: SWP ( 97.2%)
                                    INO ( 96.6%)
                                    DRW ( 95.4%)

********************************************************************
Finding LIN.04: Performance of LINUX scanners by virus classes:
                ----------------------------------------------------
                Perfect scanners for file zoo:     ---
                Excellent scanners for file zoo:   AVP,FSE,SCN,FPR,CMD
                Very Good scanners for file zoo:   SWP,INO
                Perfect scanners for macro zoo:    FSE
                Excellent scanners for macro zoo:  AVP,SCN,CMD,FPR,
                                                   SWP,DRW,INO
                Very Good scanners for macro zoo:  ANT
                Perfect scanners for script zoo:   ---
                Excellent scanners for script zoo: FSE,SCN,AVP,CMD,FPR
                Very Good scanners for script zoo: SWP,INO,DRW
********************************************************************

Eval LIN.05: Detection of Packed File and Macro Viruses under LINUX:
====================================================================

Detection of file and macro viruses within packed objects becomes
essential for on-access scanning, esp. for incoming email possibly
loaded with malicious objects. It therefore seems reasonable to test
whether at least ITW viral objects compressed with given popular
methods are also detected. The following 6 packers were used in
these tests: PKZIP, ARJ, LHA, RAR 1.5, WinRAR 3.0 and CAB.

**********************************************************
No product detected ITW samples packed with ALL 6 packers.
**********************************************************

Tests are performed only on In-The-Wild viruses packed once (no
recursive packing). As the last test showed that AV products are
rather far from perfect detection of packed viruses, the testbed has
been left essentially unchanged to ease comparison and improvement.
Following the analysis of detection of ITW file AND macro viruses
(see LIN.02), the following grading is applied:
 "perfect":   100% detection of ITW file AND macro viruses
              packed with 6 packers
 "excellent": 100% detection of ITW file AND macro viruses
              packed with 5 packers
 "very good": 100% detection of ITW file AND macro viruses
              packed with 4 packers
 "good":      100% detection of ITW file AND macro viruses
              packed with 3 packers

Concerning overall detection of ITW file AND macro viruses, only
1 product detects ALL packed ITW objects with 5 packers
("excellent"):                                                 SCN
And 1 product detects ALL packed ITW objects with 3 packers
("good"):                                                      CMD

Detection of packed ITW viruses is well developed for macro viruses:
 - 2 products detect 100% ITW viruses with 6 packers: AVP,FSE
 - 3 products detect 100% ITW viruses with 5 packers: DRW,SCN,SWP
 - 3 products detect 100% ITW viruses with 4 packers: CMD,FPR,INO

In significant contrast, detection of packed ITW file viruses is
less developed:
 - NO product detects 100% ITW file viruses with 6 packers:    ---
 - 1 product detects 100% ITW viruses with 5 packers
   and is rated "excellent":                                   SCN
 - 1 product detects 100% ITW viruses with 3 packers
   and is rated "good":                                        CMD

Remark: several products miss the grades "excellent" or "very good"
by a small margin, as they miss detecting just ONE ITW virus. A few
other products miss 1 ITW file virus both unpacked and packed, but
detect all others reliably: AVP,FSE (6 packers), DRW,SWP (5), FPR (4).

A "perfect" product would detect ALL packed ITW file/macro viruses
(100%) for 6 packers:
-------------------------------------------------------
"Perfect" packed file/macro virus detectors:      ---
-------------------------------------------------------

An "excellent" product would detect ALL packed ITW file/macro
viruses (100%) for 5 packers:
-------------------------------------------------------
"Excellent" packed file/macro virus detector:     SCN
"Good" packed file/macro virus detector:          CMD
-------------------------------------------------------

Detection of unpacked and packed ITW viruses: (L) Loss of detection rate?
-------------------------------------------------------------------------

One NEW table analyses whether all ITW file/macro viruses which a
product detects in UNPACKED form are also detected when PACKED with
one of the 6 packers. The related tables show that some scanners
detect ALL ITW viruses RELIABLY both in unpacked and packed forms.
But some scanners show a significant LOSS of detection for file and
macro viruses (tables LIN.F3L and LIN.M3L).

The following products have NO LOSS of their detection ability for
packed ITW viruses and are regarded as "perfect" in this sense:

"Perfect scanners":
  No LOSS in detection of ALL ITW viruses through 6 packers: AVP
  No LOSS in ITW file virus detection through 6 packers:     AVP
  No LOSS in ITW macro virus detection through 6 packers:    AVP
"Excellent scanners":
  No LOSS in detection of ALL ITW viruses through 5 packers: SCN,SWP
  No LOSS in ITW file virus detection through 5 packers:     SCN,SWP
  No LOSS in ITW macro virus detection through 5 packers:    SCN,SWP
"Good scanners":
  No LOSS in detection of ALL ITW viruses through 3 packers: CMD,FPR
  No LOSS in ITW file virus detection through 3 packers:     CMD,FPR
  No LOSS in ITW macro virus detection through 3 packers:    CMD,FPR

Comment: one product - DRW - detects 5 viruses when packed which it
does NOT detect in unpacked form. Consequently, the "LOSS of
detection through packing" becomes negative: -5.
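
This "loss" bookkeeping can be stated compactly. The following
Python sketch (illustrative only; names and data structures are
ours, not VTC's tooling) also shows why the loss can become
negative, as in the DRW case:

   def packing_loss(detected_unpacked, detected_packed):
       # Both arguments are sets of ITW sample identifiers.
       missed = len(detected_unpacked - detected_packed)  # lost by packing
       gained = len(detected_packed - detected_unpacked)  # found only packed
       return missed - gained

   # A scanner detecting 5 samples ONLY in packed form (and losing
   # none) gets a negative loss, cf. the DRW comment above:
   #   packing_loss({"a"}, {"a","b","c","d","e","f"})  ->  -5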
***********************************************************************
Findings LIN.05: Detection of packed viral objects needs improvement:
                 Perfect packed ITW file/macro virus LINUX detector:
                     ---
                 Excellent packed ITW file/macro virus LINUX detector:
                     SCN
                 -----------------------------------------------------
                 Concerning detection of packed FILE viruses:
                  NO product is "perfect":    ---
                  1 product is "excellent":   SCN
                 *****************************************************
                 Concerning detection of packed MACRO viruses:
                  2 products are "perfect":   AVP,FSE
                  3 products are "excellent": DRW,SCN,SWP
                 *****************************************************
                 Concerning EQUAL detection of UNPACKED AND PACKED
                 ITW file and macro viruses:
                  1 "perfect" product has NO LOSS for 6 packers:
                     AVP
                  2 "excellent" products have NO LOSS for 5 packers:
                     SCN,SWP
***********************************************************************

Eval LIN.06: Avoidance of False Alarms (File/Macro) under LINUX:
================================================================

First introduced in VTC test "1998-10", a set of clean (and
non-malicious) objects has been added to the macro virus testbeds to
determine the ability of scanners to avoid False-Positive (FP)
alarms. This ability is essential for "excellent" and "very good"
scanners, as there is no automatic aid for customers to handle such
cases (besides the psychological impact on the customer's work).
Therefore, the grid used for grading AV products must be
significantly more rigid than the one used for detection.

The following grid is applied to classify scanners (a short code
sketch of this grading follows after the remark below):
   - False Positive rate  = 0.0%: scanner is graded "perfect"
   - False Positive rate <= 0.6%: scanner is graded "excellent"
   - False Positive rate <  2.5%: scanner is graded "very good"
   - False Positive rate <  5.0%: scanner is graded "good enough"
   - False Positive rate < 10.0%: scanner is graded "rather bad"
   - False Positive rate < 20.0%: scanner is graded "very bad"
   - False Positive rate > 20.0%: scanner is graded "useless"

It is good to observe that 10 (of 11) scanners avoid FP alerts on
clean files. Concerning clean macro objects, however, only 6 (out of
11) products are "perfect" in avoiding any alarm, and 4 more
products are "excellent", as they alert on at most 2 samples
(<=0.6%).

Remark: the testbed included 2 CLEAN (non-viral, non-malicious)
macro objects which were taken from a goat generator. While there is
no reason for alerting on such samples, SOME AV experts argue that
these samples must be detected as they may be used by VX groups.
Indeed, some scanners diagnosed those samples as an "infection",
which is misleading, and this is counted as a "false positive
diagnosis" (a warning would be acceptable).
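
The FP grid translates into code just as directly as the detection
grids; a minimal Python sketch (ours; the clean-testbed size in the
example is an assumption, since the exact size is not stated here):

   def fp_grade(false_alarms, clean_samples):
       rate = 100.0 * false_alarms / clean_samples
       if rate == 0.0: return "perfect"
       if rate <= 0.6: return "excellent"
       if rate <  2.5: return "very good"
       if rate <  5.0: return "good enough"
       if rate < 10.0: return "rather bad"
       if rate < 20.0: return "very bad"
       return "useless"

   # 2 alarms on a clean macro testbed of (assumed) 350 objects:
   #   fp_grade(2, 350) -> "excellent"   (2/350 = 0.57% <= 0.6%)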
Regarding the ability of scanners to avoid FP alarms, 4 (of 11)
products reported NO SINGLE False Positive alarm in the file and
macro zoo testbeds and are therefore rated "perfect":
--------------------------------------------------------------
4 "Perfect" file AND macro FP-avoiding LINUX scanners:
     ANT,INO,SCN,SWP
--------------------------------------------------------------

Concerning avoidance of false positive alarms on file viruses:
--------------------------------------------------------------
8 "Perfect" file FP-avoiding LINUX scanners:
     ANT,AVP,DRW,FPR,FSE,INO,SCN,SWP
1 "Excellent" file FP-avoiding LINUX scanner:
     CMD
--------------------------------------------------------------

Concerning avoidance of false positive alarms on macro viruses:
-----------------------------------------------------------------
4 "Perfect" macro FP-avoiding LINUX scanners:
     ANT,INO,SCN,SWP
4 "Excellent" macro FP-avoiding LINUX scanners:
     AVP,CMD,FPR,FSE
-----------------------------------------------------------------

(In addition, OAV and CLA did not report any false alarm, but given
their low detection rates, this cannot be considered for grading.)

********************************************************************
Findings LIN.06: Avoidance of False-Positive Alarms is insufficient
                 and needs improvement.
                 FP-avoiding perfect LINUX scanners: ANT,INO,SCN,SWP
                 ***************************************************
                 Concerning file-FP avoidance, these 8 products
                 are "perfect":   ANT,AVP,DRW,FPR,FSE,INO,SCN,SWP
                 The following product is "excellent": CMD
                 ***************************************************
                 Concerning macro-FP avoidance,
                 4 products are "perfect":   ANT,INO,SCN,SWP
                 4 products are "excellent": AVP,CMD,FPR,FSE
********************************************************************

Eval LIN.07: Detection of File, Macro and Script Malware under LINUX
====================================================================

Since test "1997-07", VTC also tests the ability of AV products to
detect non-viral malware. An essential argument for this category is
that customers are interested in also being warned about and
protected from non-viral and non-worm malicious objects such as
trojans etc., the payload of which may be disastrous to their work
(e.g. stealing passwords). Since VTC test "1999-03", malware
detection is a mandatory part of VTC tests, both for submitted
products and for those downloaded as free evaluation copies. In
previous tests, malware detection was tested for file and macro
malware; since test "2001-10", script malware is also included. As
VTC tests were the first to include malware detection tests, we are
very glad to observe that a growing number of scanners is indeed
able to detect non-viral malware.

The following grid (admittedly with reduced granularity) is applied
to classify detection of file, macro and script malware:
   - detection rate  = 100%   : scanner is "perfect"
   - detection rate  >  90%   : scanner is "excellent"
   - detection rate of 80-90% : scanner is "very good"
   - detection rate of 60-80% : scanner is "good enough"
   - detection rate of < 60%  : scanner is "not good enough"

In comparison to the last test (2003-04), where we reported for the
first time that 2 scanners - AVP and SCN - detected ALL specimens of
file, macro and script malware, product behaviour has deteriorated
significantly. This is probably due to the strong growth esp. of
(non-viral) file malware. In contrast, macro malware - which
develops least - is perfectly detected by 3 products.
Generally, detection of malware needs significant further
development, as the mean detection rates over platforms show:
   mean detection rate for file malware:   67.7% (81.7% for scanners >10%)
                       for macro malware:  79.4% (97.0% for scanners >10%)
                       for script malware: 69.4% (83.3% for scanners >10%)

Concerning File, Macro & Script malware detection under LINUX:
                                                 (file/macro/script)
------------------------------------------------------------------------
"Perfect" file,macro&script malware detectors:   ==== NONE ====
"Excellent" file,macro&script malware detectors: FSE (99.9% 100%  99.1%)
                                                 AVP (99.4% 100%  98.2%)
                                                 SCN (97.8% 99.8% 98.2%)
"Very Good" file,macro&script malware detectors: FPR (96.8% 100%  89.4%)
                                                 CMD (94.8% 99.6% 88.2%)
------------------------------------------------------------------------

Concerning file malware detection only, NO product is rated
"perfect", but 6 products are "excellent" and NO products are rated
"very good":
-------------------------------------------------
"Perfect" file malware detectors:    === NONE ===
"Excellent" file malware detectors:  FSE ( 99.9%)
                                     AVP ( 99.4%)
                                     SCN ( 97.8%)
                                     FPR ( 96.8%)
                                     CMD ( 94.8%)
                                     SWP ( 90.8%)
"Very Good" file malware detectors:  === NONE ===
-------------------------------------------------

Concerning macro malware detection only, 3 products are rated
"perfect", and 4 more reach grade "excellent":
-------------------------------------------------
"Perfect" macro malware detectors:   AVP ( 100% )
                                     FPR ( 100% )
                                     FSE ( 100% )
"Excellent" macro malware detectors: SCN ( 99.8%)
                                     CMD ( 99.6%)
                                     DRW ( 99.0%)
                                     SWP ( 98.4%)
"Very Good" macro malware detectors: INO ( 89.7%)
                                     ANT ( 86.4%)
-------------------------------------------------

Concerning script malware detection only, NO product is rated
"perfect", but 3 are rated "excellent" and 2 "very good":
-------------------------------------------------
"Perfect" script malware detectors:   === NONE ===
"Excellent" script malware detectors: FSE ( 99.1%)
                                      AVP ( 98.2%)
                                      SCN ( 98.2%)
"Very Good" script malware detectors: FPR ( 89.4%)
                                      CMD ( 88.2%)
-------------------------------------------------

**********************************************************************
Findings LIN.07: NO LINUX product can be rated "perfect" in detecting
                 ALL file, macro & script malware specimens:
                 NO product is rated "Overall Perfect":    ---
                 3 products are rated "Overall Excellent": FSE,AVP,SCN
                 2 products are rated "Overall Very Good": FPR,CMD
                 ******************************************************
                 Concerning single classes of malware:
                 A) "perfect" file malware detectors:    ---
                    "excellent" file malware detectors:
                        FSE,AVP,SCN,FPR,CMD,SWP
                    "very good" file malware detectors:  ---
                 B) "perfect" macro malware detectors:   AVP,FPR,FSE
                    "excellent" macro malware detectors:
                        SCN,CMD,DRW,SWP
                    "very good" macro malware detectors: INO,ANT
                 C) "perfect" script malware detectors:  ---
                    "excellent" script malware detectors: FSE,AVP,SCN
                    "very good" script malware detectors: FPR,CMD
***********************************************************************

Eval LIN.SUM: Grading of LINUX products:
========================================

Under the scope of VTC's grading system, a "Perfect LINUX AV/AM
product" would have the following characteristics:

Definition (1): A "Perfect AntiVirus (AV) product"
--------------------------------------------------
 1) Will detect ALL viral samples "In-The-Wild" AND at least 99% of
    zoo samples, in ALL categories (file, macro and script-based
    viruses), always with the same high precision of identification
    and in every infected sample,
 2) Will detect ALL ITW viral
    samples in compressed objects for all (6) popular packers, and
+2A) Will detect ALL ITW samples both in unpacked instantiations AND
    packed with ALL (6) popular packers, and
 3) Will NEVER issue a False Positive alarm on any sample which is
    not viral.

Definition (2): A "Perfect AntiMalware (AM) product"
----------------------------------------------------
 1) Will be a "Perfect AntiVirus product",
    That is: 100% ITW detection AND >99% zoo detection
             AND high precision of identification
             AND high precision of detection
             AND 100% detection of ITW viruses in compressed objects
             AND 0% False-Positive rate,
 2) AND it will also detect essential forms of malicious software,
    at least in unpacked forms, reliably at high rates (>90%).

*******************************************************************
In VTC test "2004-07", we found  *** NO perfect LINUX AV product ***
                    and we found *** NO perfect LINUX AM product ***
*******************************************************************

But several products approach our definitions at a rather high level
(taking into account "perfect" defined at the 100% level, and
"excellent" defined as >99% for virus detection and >90% for malware
detection):

Test category:             "Perfect"            "Excellent"
------------------------------------------------------------------
LINUX file ITW test:       SCN                  CMD,AVP,DRW,FPR,
                                                FSE,INO,SWP
LINUX macro ITW test:      ANT,AVP,FSE,SCN,SWP  CMD,FPR,INO
LINUX script ITW test:     AVP,SCN              CMD,FPR,FSE
------------------------------------------------------------------
LINUX file zoo test:       ---                  AVP,FSE,SCN,FPR,CMD
LINUX macro zoo test:      FSE                  AVP,SCN,CMD,FPR,
                                                SWP,DRW,INO
LINUX script zoo test:     ---                  FSE,SCN,AVP,CMD,FPR
------------------------------------------------------------------
LINUX file pack test:      ---                  SCN
LINUX macro pack test:     AVP,FSE              DRW,SCN,SWP
+ LINUX pack/unpack test:  AVP                  SCN,SWP
------------------------------------------------------------------
LINUX file FP avoidance:   ANT,AVP,DRW,FPR,     CMD
                           FSE,INO,SCN,SWP
LINUX macro FP avoidance:  ANT,INO,SCN,SWP      AVP,CMD,FPR,FSE
------------------------------------------------------------------
LINUX file malware test:   ---                  FSE,AVP,SCN,FPR,CMD,SWP
LINUX macro malware test:  AVP,FPR,FSE          SCN,CMD,DRW,SWP
LINUX script malware test: ---                  FSE,AVP,SCN
------------------------------------------------------------------

In order to support the race for more customer protection, we rank
the products in this LINUX test with a simple algorithm, counting
placements in the table above (weighting "perfect" with two points
and "excellent" with one) for the first places (see the sketch
below):

************************************************************
"Perfect" LINUX AntiVirus product:   === NONE === (22 points)
"Excellent" LINUX AV products:  1st place:  SCN     (16 points)
                                2nd place:  AVP     (15 points)
                                3rd place:  FSE     (13 points)
                                4th place:  SWP     (10 points)
                                5th place:  FPR     ( 9 points)
                                6th place:  CMD     ( 8 points)
                                7th place:  INO     ( 7 points)
                                8th place:  ANT     ( 6 points)
                                9th place:  DRW     ( 5 points)
                                LAST PLACE: CLA,OAV ( 0 points)
************************************************************
"Perfect" LINUX AntiMalware product: === NONE === (28 points)
"Excellent" LINUX AM products:  1st place:  AVP,SCN (19 points)
                                3rd place:  FSE     (17 points)
                                4th place:  FPR,SWP (12 points)
                                6th place:  CMD     (10 points)
                                7th place:  DRW     ( 6 points)
************************************************************
*********************************************************************

In addition, we regret to have to grade CLA and OAV into the class
of "useless" products!
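
For transparency, the point counting behind the two rankings can be
reproduced with a few lines of Python. This is a reconstruction from
the summary table above, not VTC's own tooling; the per-product
grade lists are transcribed from that table:

   WEIGHTS = {"perfect": 2, "excellent": 1}

   def points(placements):
       # placements: one grade per test category the product placed in
       return sum(WEIGHTS[g] for g in placements)

   # SCN, AV categories only (3x ITW, 3x zoo, 3x pack, 2x FP avoidance):
   scn_av = (["perfect"] * 3        # file/macro/script ITW
             + ["excellent"] * 3    # file/macro/script zoo
             + ["excellent"] * 3    # file pack, macro pack, pack/unpack
             + ["perfect"] * 2)     # file and macro FP avoidance
   print(points(scn_av))            # -> 16, as in the AV ranking above

   # Adding SCN's three "excellent" malware placements gives the AM score:
   print(points(scn_av + ["excellent"] * 3))   # -> 19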
=====================================================================