hoernchenmeister
Joined: 20 May 2010
Posts: 0
Posted: Fri May 21, 2010 6:31 am
Good morning all,
I am doing email scans with command-line calls from C#.
Everything works fine so far; I have just stumbled over a performance question when scanning single files repeatedly.
The call uses the following arguments for single files:
--database="C:\ProgramData\.clamwin\db" --recursive c:\somedirectory\someemail.eml
and this one for directories:
--database="C:\ProgramData\.clamwin\db" --recursive c:\somedirectory
The directory scan performs fine, scanning about 80 files in 5 seconds, whereas the single file scan takes about 12 seconds per file.
I assume this has something to do with the virus definitions being loaded before each single file scan, whereas the directory scan loads the definitions once and uses them for all files within the directory.
Does anybody have advice or a suggestion on how to improve single file scan performance?
...or could someone point me in a direction on how to handle this?
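For reference, a minimal sketch of how such a call can be issued from C# looks roughly like this (the clamscan.exe path is just a placeholder, not necessarily the real install location):

using System;
using System.Diagnostics;

class SingleFileScan
{
    static int Main()
    {
        var psi = new ProcessStartInfo
        {
            // Placeholder install path; adjust to the local ClamWin installation.
            FileName = @"C:\Program Files\ClamWin\bin\clamscan.exe",
            Arguments = @"--database=""C:\ProgramData\.clamwin\db"" --recursive ""c:\somedirectory\someemail.eml""",
            UseShellExecute = false,
            RedirectStandardOutput = true,
            CreateNoWindow = true
        };

        using (var p = Process.Start(psi))
        {
            // clamscan prints a per-file verdict plus a summary on stdout.
            Console.WriteLine(p.StandardOutput.ReadToEnd());
            p.WaitForExit();
            return p.ExitCode; // 0 = clean, 1 = virus found, 2 = error
        }
    }
}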
Any help is greatly appreciated,
best regards
Andy
GuitarBob
Joined: 09 Jul 2006
Posts: 9
Location: USA
Posted: Tue Feb 15, 2011 1:24 pm
Yes, ClamWin must load the virus signature database before each scan. In addition, it uses the clamscan executable for scanning instead of the more capable clamdscan. ClamWin really needs a rewrite of the code to improve in this area. Until then, if you are scanning all files, it might help (if only a little) to confine your scans to the 50 or so file extensions most likely to harbor malware, and to scan only the directories where malware most commonly turns up--primarily System32 and Users/Documents and Settings.
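One thing that may help from the C# side in the meantime is to queue up the single-file requests and hand several paths to one clamscan process, so the signatures are loaded only once per batch. A rough sketch (the install path, the extension list, and the BatchScan helper are only illustrative assumptions):

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using System.Linq;

static class BatchScan
{
    // Short illustrative list; extend it with whatever extensions you consider risky.
    static readonly HashSet<string> RiskyExtensions = new HashSet<string>(
        new[] { ".exe", ".dll", ".scr", ".js", ".vbs", ".eml", ".doc", ".xls" },
        StringComparer.OrdinalIgnoreCase);

    public static int ScanMany(IEnumerable<string> candidates)
    {
        // Pre-filter by extension, then pass all remaining files to ONE clamscan
        // process so the signature database is loaded only once for the batch.
        var targets = candidates
            .Where(f => RiskyExtensions.Contains(Path.GetExtension(f)))
            .Select(f => "\"" + f + "\"")
            .ToList();

        if (targets.Count == 0)
            return 0; // nothing worth scanning

        var psi = new ProcessStartInfo
        {
            FileName = @"C:\Program Files\ClamWin\bin\clamscan.exe", // assumed install path
            Arguments = "--database=\"C:\\ProgramData\\.clamwin\\db\" " + string.Join(" ", targets),
            UseShellExecute = false,
            RedirectStandardOutput = true,
            CreateNoWindow = true
        };

        using (var p = Process.Start(psi))
        {
            Console.WriteLine(p.StandardOutput.ReadToEnd());
            p.WaitForExit();
            return p.ExitCode; // 0 = clean, 1 = infected, 2 = error
        }
    }
}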
Regards,