How to Scan Large Storage Volumes for Malware Without Impacting Performance

In a business environment, security scans of workstation hard drives usually run automatically and go largely unnoticed by users. With servers storing terabytes of data, however, the situation is more complicated, especially if an emergency scan is triggered after a security incident. In these cases, it is essential to ensure data security without degrading system performance.

Below are a series of strategies to optimize scans of large data volumes, minimizing risks and execution times.


1. Pre-scan Preparation

Before initiating a mass storage scan, it is important to conduct a series of preliminary checks to avoid errors and improve the efficiency of the process.

Infrastructure Verification

  • Ensure that the operating system is up to date and compatible with the storage infrastructure.
  • Confirm that the server executing the scan has a powerful multi-core processor, sufficient RAM, and fast storage for temporary files (a basic readiness check is sketched after this list).
  • Ensure a fast connection to the disks using high-performance interfaces, preferably a SAN (Storage Area Network).
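
As an illustration, the following minimal Python sketch runs these checks before a scan is launched. The thresholds and the temporary-files path are assumptions to adapt to your own environment, and the RAM figure relies on the third-party psutil package.

    import os
    import shutil
    import psutil  # third-party: pip install psutil

    MIN_CORES = 8             # assumed minimum core count for the scan host
    MIN_RAM_GB = 16           # assumed minimum available memory
    MIN_TEMP_FREE_GB = 100    # assumed free space for temporary scan files
    TEMP_PATH = "/var/tmp"    # hypothetical location of the scanner's temporary files

    def check_scan_host() -> list[str]:
        """Return a list of warnings; an empty list means the host looks ready."""
        warnings = []
        if (os.cpu_count() or 0) < MIN_CORES:
            warnings.append(f"fewer than {MIN_CORES} CPU cores available")
        if psutil.virtual_memory().available < MIN_RAM_GB * 1024**3:
            warnings.append(f"less than {MIN_RAM_GB} GB of RAM available")
        if shutil.disk_usage(TEMP_PATH).free < MIN_TEMP_FREE_GB * 1024**3:
            warnings.append(f"less than {MIN_TEMP_FREE_GB} GB free at {TEMP_PATH}")
        return warnings

    if __name__ == "__main__":
        for warning in check_scan_host():
            print("WARNING:", warning)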

Backup Check

Even though a scan should not modify stored data, it is always advisable to have a recent backup. It is wise to:

  • Check the date and status of the most recent backups (a simple freshness check is sketched after this list).
  • Evaluate the possibility of creating a new backup before the scan if no updated versions exist.
  • Confirm that data recovery procedures have been tested and work correctly.
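
One way to automate the first of these checks is sketched below. It assumes backups are stored as files under a known directory (here the hypothetical /mnt/backups) and treats anything older than seven days as stale; both the path and the age limit are placeholders.

    import time
    from pathlib import Path

    BACKUP_DIR = Path("/mnt/backups")  # hypothetical backup location
    MAX_AGE_DAYS = 7                   # assumed acceptable backup age

    def latest_backup_age_days(backup_dir: Path) -> float | None:
        """Age in days of the newest file under backup_dir, or None if none exist."""
        mtimes = [p.stat().st_mtime for p in backup_dir.rglob("*") if p.is_file()]
        if not mtimes:
            return None
        return (time.time() - max(mtimes)) / 86400

    if __name__ == "__main__":
        age = latest_backup_age_days(BACKUP_DIR)
        if age is None:
            print("No backups found: create one before scanning.")
        elif age > MAX_AGE_DAYS:
            print(f"Latest backup is {age:.1f} days old: consider a fresh backup first.")
        else:
            print(f"Latest backup is {age:.1f} days old: acceptable.")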

Data Type Analysis

The type of data stored influences the scan’s impact on the system. If the disks contain:

  • Heterogeneous files and compressed archives → these require more resources, since each file (including every file packed inside an archive) must be analyzed individually.
  • Large volumes of files in low-risk formats (videos, databases, intact backups) → the load will be lighter and targeted exclusions can be applied.

Based on this assessment, it can be determined whether it is advisable to run scans in parallel on different disks to expedite the process.
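
The following sketch illustrates one way to make that assessment: it walks a volume and tallies how many bytes fall into the heavier category (archives) versus lighter ones (large media and backup formats). The extension lists and the mount point are assumptions, not a fixed classification.

    from collections import Counter
    from pathlib import Path

    HEAVY = {".zip", ".rar", ".7z", ".gz", ".tar"}    # archives: scanned file by file
    LIGHT = {".mp4", ".mkv", ".avi", ".bak", ".vbk"}  # large, low-risk formats

    def profile_volume(root: str) -> Counter:
        """Tally bytes per category so the likely scan load can be judged in advance."""
        sizes = Counter()
        for path in Path(root).rglob("*"):
            try:
                if not path.is_file():
                    continue
                ext = path.suffix.lower()
                bucket = "heavy" if ext in HEAVY else "light" if ext in LIGHT else "other"
                sizes[bucket] += path.stat().st_size
            except OSError:
                continue  # skip entries that disappear or cannot be read
        return sizes

    if __name__ == "__main__":
        print(profile_volume("/data/volume1"))  # hypothetical mount point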


2. Scan Configuration

Strategic Scheduling of the Scan

To avoid affecting the company’s operations, it is advisable to schedule the scan during off-peak hours, such as at night or on weekends. If this is not possible, it is essential to notify users about potential system slowdowns.
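
As a simple illustration, the guard below only allows a scan to start inside an assumed off-peak window of 22:00 to 06:00. In practice, the scheduler built into the security product, or the operating system's own task scheduler, would handle this.

    from datetime import datetime, time

    OFF_PEAK_START = time(22, 0)  # assumed start of the off-peak window
    OFF_PEAK_END = time(6, 0)     # assumed end of the off-peak window

    def in_off_peak_window() -> bool:
        """True if the current time falls in the window, which crosses midnight."""
        now = datetime.now().time()
        return now >= OFF_PEAK_START or now <= OFF_PEAK_END

    if __name__ == "__main__":
        if in_off_peak_window():
            print("Within the off-peak window: the scan may start.")
        else:
            print("Outside the off-peak window: postpone the scan or notify users.")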

Disk Space and Quarantine

  • Verify that there is sufficient free disk space for temporary files and for quarantine in case infected files are found (a quick check is sketched after this list).
  • Adjust the quarantine storage limits so that quarantined files are not deleted automatically if a large number of threats is detected at once.
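
A quick way to verify the first point is sketched below; the quarantine path and the reserve size are placeholders, since the actual quarantine location and limits depend on the security product in use.

    import shutil

    QUARANTINE_PATH = "/var/quarantine"  # hypothetical quarantine directory
    RESERVE_GB = 200                     # assumed worst-case space for quarantined files

    free_gb = shutil.disk_usage(QUARANTINE_PATH).free / 1024**3
    if free_gb < RESERVE_GB:
        print(f"Only {free_gb:.0f} GB free for quarantine: free space or raise limits first.")
    else:
        print(f"{free_gb:.0f} GB free: enough for the assumed {RESERVE_GB} GB reserve.")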

Exclusion Optimization

To reduce scan times without compromising security, it is recommended to exclude:

  • Extremely large files (over several gigabytes).
  • Software distributions and backups that have not been modified since the last scan.
  • Non-executable files, though cautiously, as some text or image files may contain malicious fragments.

It is also advisable to delete temporary files and unnecessary folders before the scan to avoid wasting time analyzing irrelevant data.
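
The sketch below shows how an exclusion list based on the rules above could be assembled: files over an assumed 4 GB cutoff, plus distribution and backup formats that have not changed since a placeholder last-scan date. It is only an illustration; real exclusions are normally configured in the security product itself.

    from datetime import datetime, timedelta
    from pathlib import Path

    SIZE_LIMIT = 4 * 1024**3                         # assumed cutoff: 4 GB
    STATIC_EXTS = {".iso", ".bak", ".vbk"}           # distributions and backups
    LAST_SCAN = datetime.now() - timedelta(days=30)  # placeholder for the previous scan date

    def build_exclusions(root: str) -> list[Path]:
        """Collect oversized files and unmodified static files as exclusion candidates."""
        excluded = []
        for path in Path(root).rglob("*"):
            try:
                if not path.is_file():
                    continue
                info = path.stat()
                too_big = info.st_size > SIZE_LIMIT
                unchanged_static = (path.suffix.lower() in STATIC_EXTS
                                    and datetime.fromtimestamp(info.st_mtime) < LAST_SCAN)
                if too_big or unchanged_static:
                    excluded.append(path)
            except OSError:
                continue
        return excluded

    if __name__ == "__main__":
        for candidate in build_exclusions("/data/volume1"):  # hypothetical mount point
            print(candidate)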


3. Key Adjustments to Improve Performance

The scan settings can be optimized based on system load and security priority:

  • Resource Allocation: If the server is not in use during the scan, up to 80% of the CPU and memory can be allocated. If it must continue operating, this percentage should be significantly lower.
  • Activate Optimization Technologies: Features like iChecker and iSwift in security solutions allow for skipping files that haven’t changed since the last scan.
  • Avoid Password Prompts for Protected Files: If the scanner stops to request a password for protected archives, the entire task can stall unnecessarily.
  • Set Heuristic Analysis to Medium Level: This detects threats without excessively increasing system load.
  • Establish Detailed Event Logging: To diagnose issues, the log should record clearly which files were scanned and with what result (a minimal example follows this list).
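
Two of these adjustments, lowering the scanner's priority and keeping a detailed log, are illustrated below. The os.nice() call is POSIX-only, scanning itself is reduced to walking the tree and logging each file, and the mount point is hypothetical; the real inspection would be done by your security product.

    import logging
    import os
    from pathlib import Path

    logging.basicConfig(
        filename="scan.log",
        level=logging.INFO,
        format="%(asctime)s %(levelname)s %(message)s",
    )

    def run_low_priority_scan(root: str) -> None:
        """Walk the tree at low CPU priority and log every file visited."""
        os.nice(10)  # POSIX-only: lower priority so production workloads win
        for path in Path(root).rglob("*"):
            if path.is_file():
                logging.info("scanned %s", path)  # the real inspection would happen here

    if __name__ == "__main__":
        run_low_priority_scan("/data/volume1")  # hypothetical mount point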

4. Executing the Scan and Real-Time Monitoring

Before conducting a full scan, it is advisable to perform a test on a small partition (no more than one terabyte) to assess:

  • Estimated execution time.
  • Impact on server performance.
  • Possible errors in the log.

If the test scan takes too long, the logs can be reviewed to identify bottlenecks and adjust configurations before proceeding with a broader analysis.
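
One way to turn the test run into an estimate is to measure throughput and extrapolate, as in the sketch below. The scan_path() function is a placeholder for launching the real scanner, and the test size and mount points are assumptions.

    import shutil
    import time

    def scan_path(path: str) -> None:
        """Placeholder: replace with the real scanner invocation for your product."""
        time.sleep(1)

    def estimate_full_scan_hours(test_path: str, test_bytes: int, total_bytes: int) -> float:
        """Time the test scan, derive throughput, and extrapolate to the full volume."""
        start = time.perf_counter()
        scan_path(test_path)
        elapsed = time.perf_counter() - start
        throughput = test_bytes / max(elapsed, 1e-9)  # bytes scanned per second
        return total_bytes / throughput / 3600

    if __name__ == "__main__":
        total = shutil.disk_usage("/data").used       # hypothetical volume to be scanned
        test_size = 500 * 1024**3                     # assumed size of the test partition
        hours = estimate_full_scan_hours("/data/test", test_size, total)
        print(f"Estimated full scan time: {hours:.1f} h")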

For large volumes of data, rather than executing a single massive scan, it is better to split the analysis into separate tasks by disk or directory. This allows for:

  • Running scans in parallel if the infrastructure supports it (see the sketch after this list).
  • Reducing the risk of interruptions in case a scan fails.
  • Optimizing resource management to avoid system overload.
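
A minimal sketch of this approach follows: top-level directories are scanned as independent tasks in a bounded worker pool, so one failed task does not abort the others. scan_directory() is a placeholder for the real scanner, and the worker count simply mirrors the 80% guideline above.

    import os
    from concurrent.futures import ProcessPoolExecutor, as_completed
    from pathlib import Path

    def scan_directory(path: str) -> tuple[str, bool]:
        """Placeholder: call the real scanner here and report whether it succeeded."""
        return path, True

    def scan_in_parallel(root: str) -> None:
        targets = [str(p) for p in Path(root).iterdir() if p.is_dir()]
        workers = max(1, int((os.cpu_count() or 1) * 0.8))  # mirrors the 80% guideline
        with ProcessPoolExecutor(max_workers=workers) as pool:
            futures = {pool.submit(scan_directory, t): t for t in targets}
            for future in as_completed(futures):
                target = futures[future]
                try:
                    _, ok = future.result()
                    print(f"{target}: {'done' if ok else 'failed'}")
                except Exception as exc:  # one failed task does not stop the others
                    print(f"{target}: error {exc}")

    if __name__ == "__main__":
        scan_in_parallel("/data")  # hypothetical mount point holding the volumes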

During the scan execution, it is crucial to monitor system performance in real time and pay attention to any anomalies that require immediate intervention.
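
A lightweight monitor such as the one below can run alongside the scan and flag saturation. It relies on the third-party psutil package, the I/O counters it prints are cumulative since boot, and the 90% CPU alert threshold is an arbitrary example.

    import psutil  # third-party: pip install psutil

    ALERT_CPU_PERCENT = 90  # arbitrary example threshold

    def monitor(interval_s: int = 5, cycles: int = 12) -> None:
        """Print CPU, memory and cumulative disk I/O figures every interval_s seconds."""
        for _ in range(cycles):
            cpu = psutil.cpu_percent(interval=interval_s)  # blocks for interval_s
            mem = psutil.virtual_memory().percent
            io = psutil.disk_io_counters()
            print(f"cpu={cpu:.0f}%  mem={mem:.0f}%  "
                  f"read_total={io.read_bytes >> 20} MB  write_total={io.write_bytes >> 20} MB")
            if cpu > ALERT_CPU_PERCENT:
                print("ALERT: CPU saturated, consider throttling or pausing the scan")

    if __name__ == "__main__":
        monitor()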


Conclusion: Balancing Security and Performance

Scanning large volumes of data for malware is a critical process that must be conducted with planning and optimization to avoid affecting enterprise performance. By applying these steps, organizations can maintain the security of their storage without compromising the functionality of their systems.

With the combination of an adequate infrastructure, optimized adjustments, and controlled execution, it is possible to detect and mitigate threats without generating unnecessary downtime. In an environment where data is the most valuable asset, ensuring its protection without sacrificing performance is the key to success.
