Close to a year ago I wrote an article decrying the futility of pen testing that periodically gets resuscitated on Twitter. In relatively stark terms, it called out reasons why manual web testing remains important, but insufficient, inconsistent, and imperfect. The intent was to push the boundaries of the comfort zone in which we accept, “It’s always been done this way and therefore always will.”
Recently Haroon Meer explored this topic more thoroughly at 44CON with the presentation, “Penetration Testing considered harmful today”. I recommend watching the recorded presentation or reviewing the slides. Just as the topic has enormous potential for concern trolling and indifferent dismissal, it has potential for constructive discussion. If you’re a pen tester, set aside the idea that you’ve been personally attacked by the mere hint of criticism. Instead, consider the points made about the importance of pen test quality, about evaluating real-world threats vs. the threat posed by a single test team, and about how or why large organizations still suffer compromises (from Sony to RSA to certificate authorities). Answer those questions well and you’ll establish yourself as a premium service rather than a disposable commodity.
The question, “Are we actually improving anything?” isn’t unique to pen testing. Security software needs the same attention (thus recurring questions about whether AV remains relevant[1]). Last year’s ModSecurity challenge produced several good lessons (and should be commended for its transparency). One lesson relevant to pen testing was that the “Time to Hack” a site with SQL injection was about 10 hours. Don’t generalize that number beyond the challenge itself, but consider how relatively short that is for finding and exploiting a vulnerability — all the while bypassing a basic set of ModSecurity rules. Would the pen testing team you hired be as efficient or effective? And what would you do if you had 100 similar sites to review? Hire another 100 pen test teams?
Let’s return to web app testing. I previously lamented the lack of coherent formats for sharing test results. Static PDF files are poor enablers of improving and maintaining security after a pen test, no matter how well-described a vulnerability may be. Instead of (or in addition to) a snapshot of the app’s security posture, a collection of re-usable data would help developers not only review findings, but also ensure those vulnerabilities aren’t reintroduced.
- Provide a common format for reproducing proof of a vulnerability.
- Make that proof easy for a non-technical consumer to execute (e.g. needing only a browser and an Open Source plugin).
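As a rough sketch of what such re-usable data might look like, consider a finding expressed as a small structured record rather than PDF prose. The field names and the `to_curl`/`still_vulnerable` helpers below are purely illustrative assumptions, not any real standard: the point is that the same record can both reproduce the proof and be re-checked automatically after a fix.

```python
from urllib.parse import urlencode

# Hypothetical finding record; field names are illustrative, not from
# any established reporting standard.
finding = {
    "id": "XSS-001",
    "class": "reflected-xss",
    "request": {
        "method": "GET",
        "url": "https://example.com/search",
        "params": {"q": "<script>alert(1)</script>"},
    },
    # Evidence string whose unencoded reflection demonstrates the bug.
    "evidence": "<script>alert(1)</script>",
}

def to_curl(f):
    """Render the reproduction step as a curl command that a
    non-technical consumer could paste into a terminal."""
    url = f["request"]["url"] + "?" + urlencode(f["request"]["params"])
    return "curl -sG '{}'".format(url)

def still_vulnerable(f, response_body):
    """Re-check the finding against a fresh response: if the evidence
    string appears unencoded, the vulnerability has been reintroduced."""
    return f["evidence"] in response_body

print(to_curl(finding))
# A patched app would HTML-encode the payload, so the check fails:
print(still_vulnerable(finding, "results for &lt;script&gt;alert(1)&lt;/script&gt;"))
```

A developer could run the same record in a regression suite after every deploy, which is exactly the kind of reuse a static report can’t offer.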
Whether or not you agree with the “Futility” article or Haroon’s presentation, trolling 140 characters at a time adds little to the conversation. Why not find a better outlet to prove pen testing is already perfect (!?) or to improve its accepted deficiencies? If you can code, there are Open Source security projects to contribute to.
If you prefer to write words instead of code, there are projects like the OWASP Testing Guide, or you could add more language-specific examples of countermeasures for the OWASP Top 10. It never hurts to improve the signal-to-noise ratio of web security.
[1] The sandboxing in mobile devices lessens the utility of anti-virus in the desktop sense; however, tools to protect privacy, detect malicious apps, or detect undesirable apps (that intentionally scrape data) remain important.