Defense in depth is a common concept when designing a cybersecurity strategy. The idea is simple: you should not rely on a single tool or practice, but have multiple controls, technical or procedural, that provide redundancy, so that if one control fails, the others can still protect you. A common way to explain this concept is the seatbelt metaphor. Why would you install both a WAF (web application firewall) and a RASP (runtime application self-protection) agent? For the same reason your car has both a seatbelt and an airbag. Sure, in most cases they are redundant, and while they work differently, they both serve the same ultimate goal. However, with the stakes being so high, you should be willing to indulge this redundancy. If either of them fails, you’d want the other to protect you.
Many security architects quote this metaphor, but misuse it by not taking it far enough. Seatbelts and airbags may both save your life, but they aren’t the first line of defense; driving safely is. Having a seatbelt or airbags doesn’t mean you should drive recklessly; it just means that if some circumstance beyond your control occurs, you have a fighting chance of surviving the accident. In cybersecurity terms, the equivalent of driving safely, and therefore the real first line of defense, is having competent, empowered application developers who are aware of security considerations and who design and implement secure applications to begin with.
Security professionals often joke that application developers are the source of all their trouble, since those developers are the ones who introduce the security flaws into the product or service in the first place. While this tongue-in-cheek description of the relationship between application developers and security professionals is usually meant affectionately, it reveals a cultural problem many organizations suffer from. By seeing developers as adversaries instead of allies, many security professionals rob their organizations of their first line of defense.
The first step in harnessing the organization’s developers is educating them. Most organizations will invest in the professional development of their developers, be it by offering training courses directly, financing or subsidizing external studies, or simply by encouraging developers to devote some of their work time to self-study or to attending professional conferences. While application security may not be the primary focus of most developers, it’s important to make sure that those interested in the topic have access to relevant learning materials, or at least can take advantage of their employer’s incentives to expand their knowledge in that area. The main purpose of this education is not to make every application developer a security expert, but to make them conscious of security considerations. The most important thing this newfound awareness will achieve is getting application developers to buy into the notion that they are, in fact, the application’s first line of defense.
Once they have bought into this notion, it’s time to drive it home by empowering them to really own the security of their applications. In my experience, the best way to do this is not to expect developers to add something new to the processes they already employ, but to bring application security into their world and make it part of those processes. Application Security Testing (AST) tools are often the missing piece of the puzzle. The good tools don’t just find vulnerabilities; they do so as a seamless part of the developers’ everyday experience, either by integrating directly into the IDE when possible or as another step in the CI/CD pipeline. This integration shouldn’t stop at detecting a vulnerability the developer inadvertently introduced into the code. The entire experience should be as seamless as possible. For example, such a vulnerability may be surfaced as a warning, or even an error, in the IDE, a failed CI pipeline, or a ticket in whatever issue-tracking software the organization uses. Similarly, when the vulnerability is fixed, the error should disappear, the pipeline should pass, and the ticket should be closed. A good AST tool will also offer additional information and context explaining the vulnerability, helping educate the developer who introduced it so they can avoid making the same mistake again. In this sense, AST tools aren’t just controls that prevent vulnerabilities from being introduced into the codebase, but also teaching tools that help developers deepen their understanding of application security and prevent future vulnerabilities from ever being created.
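To make this concrete, here is a minimal sketch of what the CI/CD half of that integration might look like: a small gate script that runs an AST scan and fails the pipeline whenever findings are reported, then lets it pass again once they are fixed. The `ast-scan` command, its flags, and its JSON output format are hypothetical stand-ins for whatever tool and reporting format your organization actually uses.

```python
#!/usr/bin/env python3
"""Minimal sketch of an AST gate as a CI pipeline step.

Assumptions: a command-line scanner called `ast-scan` (hypothetical) that
prints JSON findings to stdout, and a CI system that treats a non-zero
exit code as a failed step. Swap in your real AST tool and its output.
"""

import json
import subprocess
import sys


def run_scan() -> list[dict]:
    """Run the (hypothetical) scanner over the repository and return its findings."""
    result = subprocess.run(
        ["ast-scan", "--format", "json", "."],  # hypothetical CLI and flags
        capture_output=True,
        text=True,
        check=False,  # a non-zero exit here may simply mean findings exist
    )
    try:
        return json.loads(result.stdout).get("findings", [])
    except json.JSONDecodeError:
        print("Could not parse scanner output; failing the step conservatively.")
        sys.exit(2)


def main() -> None:
    findings = run_scan()
    if not findings:
        print("No vulnerabilities found; pipeline step passes.")
        return

    # Surface each finding the way a compiler error would appear:
    # file, line, rule, and a short explanation the developer can learn from.
    for finding in findings:
        print(
            f"{finding.get('file')}:{finding.get('line')}: "
            f"[{finding.get('rule')}] {finding.get('message')}"
        )

    # Exit non-zero so the CI pipeline fails until the vulnerabilities are fixed.
    sys.exit(1)


if __name__ == "__main__":
    main()
```

In practice, most AST tools ship their own CI integrations that do essentially this, so glue code like the above is rarely needed; the point of the sketch is the design choice it embodies: the feedback arrives through the pipeline the developer already watches, and it clears itself as soon as the fix lands.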
Application security is often sensationalized and mystified in mainstream, and even technology, media, which often grants security professionals considerable social capital in the organizations that employ or hire them. It may be counter-intuitive, but, as I’ve illustrated above, in order to do a more effective job, it’s often in their best interest to demystify their field of expertise. By making security testing as mundane as possible, security professionals can turn it into an integral part of every developer’s day-to-day job and harness their organizations’ developers as the first line of defense. This not only dramatically increases the number of people devoting time and effort to securing the organization, but also frees up the security specialists’ time to handle the more “exotic” issues that require the full range of their expertise.