Vulnerability Scanners: Test methodology | Security Weekly Labs

Our aim is to engage with vendors as an actual customer would, as closely as possible. If a free trial is offered, we take it. If it is necessary to first engage with sales and request an account, we use the contact options provided on the website and wait for a reply, even if we already have contacts at the vendor.

However, while we engage like a prospective customer would, we make no attempt to hide our identity or intentions at any point. We use real names and identify ourselves as employees of CyberRisk Alliance. We’re clear from the very beginning that we intend to perform product reviews and publicly publish the results. No compensation is requested or accepted for any of our reviews.

CyberRisk Alliance monetizes product reviews by licensing them for redistribution after they have been published (commonly known as “reprint rights”). We’re aware that positive reviews are more likely to sell reprints. We believe that enough vendors are interested in an honest, independent, and unbiased review that we don’t have to worry about making everyone happy. With that said, our reviews will be as polite and fair as we can make them.

We try to establish testing methodologies and share them with vendors before testing begins. However, it isn’t always possible to make testing methodologies available to vendors for new product categories. It’s necessary to spend some time with the full range of products to understand the bounds of the category and how to measure performance within it. On the topic of performance, our reviews intentionally highlight product features and the customer experience over technical performance. We believe that technical performance, while important, shouldn’t be the focus at the expense of other product attributes.

Finally, vendors are given an opportunity to review drafts before publication. The purpose is to ensure the content of our reviews is factually correct, fair, and doesn’t include any information protected under NDA. We are clear with vendors that this is not an opportunity to insert marketing copy or rewrite our reviews. Any attempt to do so is ignored.

Vulnerability scanner testing

An important note up front: we’ve decided to split vulnerability management into two separate reviews. This review will cover commercial and open-source network vulnerability scanners. In a few months, we’ll revisit vulnerability management to cover products that focus on analysis and remediation of vulnerability scanning results (vendors like Kenna Security, Vicarius, Vulcan Cyber, etc.).

Our test network comprises approximately 100 devices, ranging from rack-mounted enterprise gear to workstations, virtual machines, network devices, and IoT devices. We’ve also placed some intentional vulnerabilities and older out-of-support devices in the environment to see how well each product identifies them. That said, we’re not doing a vulnerability-by-vulnerability or false-positive performance assessment. The closest we’ll get to assessing performance is looking at each product’s ability to properly identify each device.

Without proper identification, you don’t know what you have. If you don’t know what you have, there’s a good chance that any other steps you take will be in the wrong direction.

We will also be assessing:

  • the deployment process
  • setting up and running scans
  • ease of troubleshooting scanner issues
  • vulnerability findings
  • reporting
  • integrations (first and third party)
  • data export
  • general UI/UX usability and workflow
  • multi-user/RBAC/sharing features

We’ve decided to focus on internal network scans for this review, as it is the core and most comparable feature set for these products. Unless otherwise noted, we’ll be using default scan profiles.

What we’re not testing now, but are interested in testing in the future:

All products tested are also capable of performing external vulnerability scans, either via self-hosted scanning engines or scanning engines hosted by the vendor. When we reviewed Attack Surface Management (ASM) products, we noted that many ASM tools could soon replace external vulnerability scans. If you’d be interested in seeing us revisit that idea with a head-to-head performance test in the future, let us know.

[Image: F-Secure's hosted scan engines, which facilitate external vulnerability scans]

Several products also feature the ability to scan containers and cloud workloads. Again, we’re not going to test these capabilities in this set of reviews, as there are specialized products that focus entirely on vulnerability management in public cloud and containers. We’ll assess these capabilities in a separate test.

Defining value

For all product tests, it is necessary to define a tangible “value” to derive some of the metrics we use to evaluate products. Ideally (for us), value would be defined the same for each product within a particular category. However, many products have unique features and key differentiators that may result in a different definition of "value" from their competitors.

The value of vulnerability scanners is derived from the scanner’s ability to go through the following process with the fewest false positives and false negatives possible (a minimal scoring sketch follows the list):

  • Actively discover assets, via active network scans, credentialed scans, or agents
  • Correctly identify asset types
  • Enumerate ports and services
  • Determine vulnerabilities present on each host
  • Accurately evaluate the risk represented by these assets, noting issues that should be addressed
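
To make the false-positive/false-negative framing concrete, here’s a minimal scoring sketch in Python. This is our own illustration, not any vendor’s API or scoring method; the function name and CVE IDs are hypothetical stand-ins for the vulnerabilities planted in our test network.

    # Hypothetical scoring sketch: compare a scanner's reported findings
    # against the known ground truth of a test network. All names are
    # illustrative, not taken from any vendor's product.
    def score_findings(reported: set[str], actual: set[str]) -> dict:
        """Score reported vulnerability IDs against known ground truth."""
        true_positives = reported & actual     # correctly flagged
        false_positives = reported - actual    # flagged, but not real
        false_negatives = actual - reported    # real, but missed
        precision = len(true_positives) / len(reported) if reported else 0.0
        recall = len(true_positives) / len(actual) if actual else 0.0
        return {
            "precision": precision,            # fewer false positives
            "recall": recall,                  # fewer false negatives
            "false_positives": len(false_positives),
            "false_negatives": len(false_negatives),
        }

    # Example: four planted vulnerabilities vs. four reported findings.
    actual = {"CVE-2020-0796", "CVE-2017-0144", "CVE-2019-19781", "CVE-2021-26855"}
    reported = {"CVE-2020-0796", "CVE-2017-0144", "CVE-2014-0160", "CVE-2019-19781"}
    print(score_findings(reported, actual))
    # {'precision': 0.75, 'recall': 0.75, 'false_positives': 1, 'false_negatives': 1}

A scanner that keeps both numbers high (few false positives, few false negatives) minimizes the validation labor discussed next.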

The other key factor that goes into defining value is human labor. The five-step process outlined above should occur with as little configuration and validation work from operators as possible. Put another way, we view any analyst time saved as an increase in the value of the product. Products that report higher numbers of false positives or non-critical vulnerabilities will naturally result in more labor necessary to review and validate findings.

Looking at findings from a different perspective, accuracy is also important to value. Many organizations have responded to false negatives by buying an additional vulnerability scanning product to run alongside the first. This may result in more comprehensive results, but also increases the labor necessary to implement and maintain another product. It will also potentially generate double the number of findings to analyze and validate. The next logical step is to purchase a third vulnerability management solution to do the work of deduplicating and prioritizing the output of the first two. The idea here is to save on labor by taking a larger hit to capital expenses.

We’ll review this category of products, which automatically analyze the output of vulnerability scanners, a few months after this set of reviews comes out, and we’ll be interested to see whether the additional investment pays off in value.

Metrics

Time-to-value is a metric that describes the amount of time it generally takes to get a product from zero to fully deployed and producing value. The clock for this metric starts when the vendor provides access to the product (e.g., an account for a SaaS product, or a license key and software download).

Labor-to-value is a metric that expresses the effort necessary to keep the product at a level of performance where it is providing value consistently.

True Cost is a metric that expresses the total cost of a product, including capital expenditures, operational expenditures, and labor costs. It is effectively product cost + initial deployment cost + maintenance costs, where the following labor cost assumptions are used. We’ve listed salaries along with the actual cost of the employee to the employer, based on the U.S. Small Business Administration’s most conservative estimate (1.4x salary). We calculate hourly rates by dividing the actual cost of the employee by 2,080 hours (52 weeks multiplied by 40-hour work weeks).

  • Junior Security Analyst Salary: $50K USD ($70K) - $33.65/hr
  • Security Analyst Salary: $75K USD ($105K) - $50.48/hr
  • Senior Security Analyst Salary: $100K USD ($140K) - $67.31/hr

As an example, a one-hour meeting with two senior security analysts and two junior security analysts costs their employer $201.92.
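
As a worked sketch of that arithmetic, the following Python snippet reproduces the hourly rates and the meeting example above. The function and variable names are our own illustration; the true_cost function at the end simply restates the product cost + deployment cost + maintenance cost formula in labor terms.

    # Minimal sketch of the labor-cost arithmetic above. The 1.4x multiplier
    # is the SBA's conservative estimate of an employee's true employer cost.
    EMPLOYER_MULTIPLIER = 1.4
    WORK_HOURS_PER_YEAR = 52 * 40  # 2,080 hours

    def hourly_rate(salary: float) -> float:
        """Actual hourly cost of an employee to the employer."""
        return salary * EMPLOYER_MULTIPLIER / WORK_HOURS_PER_YEAR

    rates = {
        "junior": hourly_rate(50_000),    # $33.65/hr
        "analyst": hourly_rate(75_000),   # $50.48/hr
        "senior": hourly_rate(100_000),   # $67.31/hr
    }

    # The one-hour meeting from the example: two seniors and two juniors.
    meeting_cost = 2 * rates["senior"] + 2 * rates["junior"]
    print(f"${meeting_cost:.2f}")  # $201.92

    # True Cost = product cost + (deployment + maintenance) labor at a given rate.
    def true_cost(product_cost: float, deploy_hours: float,
                  maintain_hours: float, rate: float) -> float:
        return product_cost + (deploy_hours + maintain_hours) * rate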

Testing transparency

Since these reviews depend on our ability to test products, we’ve been swayed by Transparency in Cyber’s mission. This organization aims to convince cybersecurity vendors to remove clauses from their EULAs that restrict the right to test or review their software and openly share the results. Transparency isn’t just a core tenet in cybersecurity, but across products in general. Imagine if independent reviewers weren’t allowed to test and share the 0-60 times of sports cars, or benchmarks of the latest CPUs.

We’ll be investigating vendor EULAs and reporting on whether there’s any language restricting reviews or the sharing of test results. From what we’ve seen so far, testing isn’t always explicitly prohibited, but is sometimes allowed only with prior written authorization. Whatever the case, we’ll detail our findings (and where we found them) in each individual review.

Adrian Sanabria

Adrian is an outspoken researcher who doesn’t shy away from uncomfortable truths. He loves to write about the security industry, tell stories, and still sees the glass as half full.
