By Dan Cornell
Last Thursday I gave a talk at RSA 2012 titled “Remediation Statistics: What Does Fixing Application Vulnerabilities Cost?” We’ve been talking for a while about the need for a greater focus on remediation. Organizations are getting pretty good at “finding” vulns, but most still have a long way to go toward “fixing” them. This talk presents the results of a number of remediation engagements we’ve executed, in the hope that the lessons we’ve learned can be used by other organizations.
Slides are online here:
The abstract for the talk was:
For the security industry to mature, more data needs to be available about the true cost of security vulnerabilities. Data and statistics are starting to be released, but most of this currently focuses on the prevalence of different types of vulnerabilities and incidents rather than the costs of addressing the underlying issues. This session presents statistics from the remediation of 15 web-based applications in order to provide insight into the actual cost of remediating application-level vulnerabilities.
The presentation begins by setting out a structured model for software security remediation projects so that time spent on tasks can be consistently tracked. It lays out possible sources of bias in the underlying data to allow for better-informed consumption of the final analysis. It also discusses different approaches to remediating vulnerabilities, such as fixing easy vulnerabilities first versus fixing serious vulnerabilities first.
Next, historical data from the fifteen remediation projects is presented. This data consists of the average cost to remediate specific classes of vulnerabilities (cross-site scripting, SQL injection and so on) as well as the overall project composition, showing the percentage of time spent on actual fixes versus the percentages of time spent on other supporting activities such as environment setup, testing and verification, and deployment. The data on the remediation of specific vulnerabilities allows for a comparison of the relative difficulty of remediating different vulnerability types. The data on the overall project composition can be used to determine the relative “efficiency” of different projects.
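To give a concrete sense of what remediating one of these vulnerability classes involves, here is a minimal sketch of a typical SQL injection fix: replacing string concatenation with a parameterized query. The table, column, and function names are hypothetical and are not drawn from the talk's data.

```python
import sqlite3

def find_user_vulnerable(conn, username):
    # VULNERABLE: attacker-controlled input is concatenated directly into
    # the SQL string, so input like "' OR '1'='1" changes the query logic.
    query = "SELECT id, name FROM users WHERE name = '" + username + "'"
    return conn.execute(query).fetchall()

def find_user_fixed(conn, username):
    # FIXED: a parameterized query keeps user data separate from SQL code;
    # the driver binds the value, so it can never be interpreted as SQL.
    return conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (username,)
    ).fetchall()
```

Fixes like this are mechanically simple per instance, which is part of why per-vulnerability fix costs can be meaningfully averaged across projects.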
Finally, analysis of the data is used to create a model for estimating remediation projects so that organizations can create realistic estimates in order to make informed remediate/do not remediate decisions. In addition, characteristics of the analyzed projects are mapped to project composition to demonstrate best practices that can be used to decrease the cost of future remediation efforts.
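The slides contain the actual model and figures; as a rough illustration of the shape such an estimate takes, the sketch below multiplies per-class average fix times by vulnerability counts and scales by an overhead factor for supporting activities (environment setup, testing and verification, deployment). All numbers here are hypothetical placeholders, not the talk's data.

```python
# Hypothetical average fix times in hours per vulnerability class
# (illustrative only; not the figures presented in the talk).
AVG_FIX_HOURS = {
    "sql_injection": 4.0,
    "xss": 1.5,
    "missing_authz": 8.0,
}

def estimate_remediation_hours(vuln_counts, overhead_factor=2.5):
    """Estimate total project hours: raw fix time for all vulnerabilities,
    scaled by an overhead factor covering supporting activities."""
    fix_hours = sum(
        AVG_FIX_HOURS[vuln_type] * count
        for vuln_type, count in vuln_counts.items()
    )
    return fix_hours * overhead_factor
```

An estimate like this is what lets an organization make an informed remediate/do-not-remediate decision before committing developer time.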
Robert Lemos from Dark Reading covered the talk in his article “Fixing Vulnerabilities on a Shoestring.” We’re excited to see increasing focus from the media on this issue, and hopefully our Remediation Resource Center can help put organizations in a position to more quickly address the software vulnerabilities they’re identifying. We’re also hoping that folks find ThreadFix to be a useful tool to help organize their software vulnerability data and get it transferred to development teams so vulnerabilities actually get fixed.
dan _at_ denimgroup.com