WhiteHat Security recently published their 2012 report on website security. Like Veracode, WhiteHat collects and analyzes data from security tests run across their customer base each year. WhiteHat's analysis focuses on data from dynamic testing of 15,000 sites at 650 organizations – all results manually reviewed and verified. From this data they are able to see trends and to build industry scorecards. The report makes for fascinating reading.
On average, web sites are getting more secure each year: the average web site had over 1,000 vulnerabilities in 2007, and only 56 in 2012. SQL injection, the most popular and most serious attack vector, was found in only 7% of their customers’ web sites.
This is the good news.
What made WhiteHat’s analysis this year especially valuable is that they also surveyed customers about their secure SDLC practices and the effectiveness of their security programs. Although the survey set was small (fewer than 20% of customers responded), this data allowed WhiteHat to correlate vulnerability data with secure SDLC practices and operational controls, as well as with appsec program drivers and breach data.
Compliance Impact on Appsec
WhiteHat found that the main driver for fixing security vulnerabilities is compliance – this matches the findings of last year’s SANS Appsec survey.
But they also found that compliance is the number one reason that some vulnerabilities don’t get fixed: many organizations follow the letter of the law, doing what compliance says they have to do and only that, going no further even when it would make sense from a risk management perspective or to meet customer demands.
Best Practices and Tools – What Works?
Training developers seems to help. More than half of WhiteHat’s customers had done at least some security training for developers. Organizations that invested in security training had 40% fewer vulnerabilities and resolved them 59% faster.
But other best practices and tools don’t seem to be effective.
Just over half of customers relied on application libraries or frameworks with centralized security controls. Relying too heavily on these controls seems to provide a false sense of security: organizations that used them had 64% more vulnerabilities and resolved them 27% slower.
One factor that makes these organizations more vulnerable is that if the underlying framework is exploitable, then every site built on it is vulnerable, as the recent security problems with Rails showed. Another problem may be that developers are naïve about what a security library will do for them: a framework like Apache Shiro, for example, takes care of a lot of application security problems (authentication, authorization, session management, cryptography), but it won’t protect your app from SQL injection or XSS or CSRF or other common attacks, leaving big holes for the bad guys. There’s more work that still needs to be done to make an application secure.
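To make that concrete, here’s a minimal Java sketch – the accounts table, permission string, and method names are hypothetical – showing how a request can pass a Shiro authorization check and still be wide open to SQL injection, and how binding the input as a query parameter closes the hole:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import org.apache.shiro.SecurityUtils;

public class AccountLookup {

    // Shiro verifies WHO is asking; it says nothing about WHAT we send to the database.
    public ResultSet findAccountUnsafe(Connection db, String username) throws SQLException {
        SecurityUtils.getSubject().checkPermission("account:read");
        // Vulnerable: a username like  x' OR '1'='1  rewrites the query.
        Statement stmt = db.createStatement();
        return stmt.executeQuery(
                "SELECT * FROM accounts WHERE username = '" + username + "'");
    }

    // Same authorization check, but the input is bound as a parameter,
    // so the database never interprets it as SQL.
    public ResultSet findAccountSafe(Connection db, String username) throws SQLException {
        SecurityUtils.getSubject().checkPermission("account:read");
        PreparedStatement stmt = db.prepareStatement(
                "SELECT * FROM accounts WHERE username = ?");
        stmt.setString(1, username);
        return stmt.executeQuery();
    }
}
```

The security framework and the data access code live in different layers: the framework never sees the string concatenation, so it can’t save you from it.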
Organizations that used static analysis had 15% more vulnerabilities found through WhiteHat's dynamic testing, and resolved them on average 26% slower. Maybe that's because running a tool doesn't accomplish anything if you don't fix the vulnerabilities it finds. Or because there isn't much overlap between the vulnerabilities that static analysis finds and what's found through dynamic analysis.
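To illustrate that overlap gap with a hedged, hypothetical example: static analysis reads source code and can flag something like the hardcoded credential below, which a dynamic scanner probing the running site will never see; conversely, a dynamic scanner can find runtime problems (a missing security header, a misconfigured cookie) that don't show up anywhere in the code.

```java
public class MailConfig {
    // A static analyzer can flag this hardcoded secret just by reading the source.
    // A dynamic scanner only sees HTTP responses, so this never shows up in its results.
    static final String SMTP_PASSWORD = "hunter2"; // hypothetical credential, for illustration only
}
```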
But Nothing Stops Breaches
85% of WhiteHat's customers test their apps pre-production, a third of them before every change is pushed out. These organizations are trying to do the right thing.
But almost one quarter of WhiteHat’s customers had experienced security breaches as a result of an application vulnerability. It doesn't seem to matter if they tested often, or if they trained their developers, or how much they trained them, or if they used static analysis or secure libraries or a WAF or other operational security controls. These organizations were just as likely to experience a breach as organizations that didn't do as much training or testing, or didn't use the tools.
WhiteHat’s report raises a lot of fascinating questions. Do the breach findings mean that security testing, developer training, secure libraries, and other tools don’t work?
Or is this simply evidence of the essential asymmetry of the “Attacker’s Advantage and the Defender’s Dilemma”? Even though the average number of serious vulnerabilities is declining significantly year over year, 86% of all the web sites that WhiteHat tested had at least one serious vulnerability (and keep in mind that WhiteHat – or any other vendor – can't catch every vulnerability). On average only 61% of these vulnerabilities were fixed, and fixing them took an average of 193 days. All it takes is one vulnerability for the bad guys to get in, and we’re still giving them too many chances and too much time to succeed.
Or maybe we just need more time to see the results of training and testing and tools and other best practices. Time for developers to understand and fix legacy bugs, and to change how they design and build software so that it is safer and more secure in the first place – to “build security in”. Time for management to understand that compliance shouldn't be the main driver for building secure software. Time to raise the bar enough that the bad guys start looking for another, easier target. We’ll have to wait for WhiteHat’s next report to see if another year makes any real difference.