Technical due diligence for identifying cybersecurity risk in external parties
tl;dr - rolling up your sleeves and getting into the technical weeds of someone else's code can take a lot of effort, but it is the most effective way to identify unknown vulnerabilities in it.
Technical due diligence comprises the steps an organization takes to vet its software supply chain for unknown vulnerabilities by itself conducting (or contracting another organization to perform) analysis of 3rd (and greater) party products, infrastructure, and networks.
These efforts are necessarily more time- and resource-intensive than any of the previously discussed ones. And not every enterprise has the know-how to conduct such vetting or interpret its results, and so must pay others to do so. Thus, managing risk stemming from external parties requires careful prioritization of these efforts.
With that said, I would propose three general types of technical due diligence one can perform on a 3rd (or greater) party to reveal unknown vulnerabilities: enhanced software composition analysis (SCA), proprietary code review, and penetration testing. These methods are by no means mutually exclusive, and employing one technique often involves another (e.g. a penetration tester using a dynamic analysis tool to scan vendor code during the reconnaissance phase of a test).
In order to maximize the efficacy of technical due diligence, I would strongly recommend (or insist, if legally required) engaging the relevant technology provider before beginning such an exercise. This will give the organization subject to the diligence a chance to proactively find any issues that the tools or test might identify – along with mitigating information – which will help reduce the administrative burden on both sides and build mutual trust. Such trust is critical due to the relative information asymmetry inherent to such a relationship. Even assuming you are able to communicate with the publisher of the software or service (e.g. it is one of your vendors and not an open-source project without a clearly responsible person), in many cases you will need to take the external party at its word with respect to the internal operations of the software in question. Of course, receiving a less-than-perfect assurance that expands your understanding of the risk picture is far better than leaving the hard questions unasked.
Enhanced software composition analysis
Probably the simplest form of technical due diligence available is reviewing the dependencies of 3rd party software upon which your organization relies (e.g. reviewing the 4th party code in your technology stack). Both open-source and commercial SCA tools abound, and can often be used directly on binaries provided to you by the supplier of said products. Unfortunately, Software-as-a-Service (SaaS) vendors do not generally make their executables available to customers. Thus, you either need some sort of compelling leverage (e.g. impending closure of a large deal) to acquire these artifacts or have to rely on a software bill of materials (SBOM), assuming the vendor is willing to provide one. Additionally, the vast majority of existing SCA tools focus exclusively on identifying known vulnerabilities in 3rd party software. While this is helpful to ensure that your suppliers' development teams are avoiding obviously dangerous or malicious packages, such tools do little to help identify previously unknown threats and are thus outside the scope of this post.
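Even without a commercial SCA tool, a vendor-supplied SBOM can be mined directly. The minimal sketch below assumes a CycloneDX-style JSON document (the `SAMPLE_SBOM` fragment and its component entries are illustrative, not taken from any real vendor) and simply enumerates the 4th party components it declares:

```python
import json

def list_components(sbom_json):
    """Extract (name, version) pairs from a CycloneDX-style SBOM document."""
    sbom = json.loads(sbom_json)
    return [(c.get("name", "?"), c.get("version", "?"))
            for c in sbom.get("components", [])]

# Hypothetical SBOM fragment of the kind a vendor might provide.
SAMPLE_SBOM = json.dumps({
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "components": [
        {"type": "library", "name": "left-pad", "version": "1.3.0"},
        {"type": "library", "name": "openssl", "version": "1.0.2"},
    ],
})

if __name__ == "__main__":
    # Print each declared dependency for manual or downstream review.
    for name, version in list_components(SAMPLE_SBOM):
        print(f"{name} {version}")
```

In practice you would feed the resulting inventory into a vulnerability database or the reputation-scoring tools discussed below, rather than reviewing it by hand.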
This is where what I call "enhanced" SCA comes into play. To address the aforementioned challenge, a new generation of machine learning-enabled tools is emerging to help developers understand the provenance, reputation, and latent cybersecurity risk of specific packages and their dependencies. Employing these products to vet 4th party code proactively can help you understand if your software vendors are relying on potentially unsafe or even hacker-controlled libraries, even without having identified specific vulnerabilities.
Furthermore, although potentially time-intensive, looking at the age of open source libraries used in your applications can also be useful. A common refrain in the security world is that “software ages like milk, not wine,” which concisely describes problems posed by older dependencies. Although there may not be any identified vulnerabilities in such long-in-the-tooth libraries, all things being equal, they are more likely to have unknown flaws lurking in them than recently updated ones.
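The age heuristic above is easy to automate. This sketch assumes you have already gathered last-release dates for each dependency (the `LAST_RELEASE` entries are invented for illustration; in practice you would pull these from a registry such as PyPI or npm, or from SBOM metadata) and flags anything that has gone stale past a chosen cutoff:

```python
from datetime import date

# Hypothetical last-release dates; replace with data pulled from a
# package registry or from the vendor's SBOM.
LAST_RELEASE = {
    "requests": date(2024, 5, 29),
    "some-legacy-lib": date(2016, 3, 1),
}

def stale_dependencies(releases, today, max_age_days=2 * 365):
    """Return names of dependencies whose last release predates the cutoff."""
    return sorted(name for name, released in releases.items()
                  if (today - released).days > max_age_days)

if __name__ == "__main__":
    print(stale_dependencies(LAST_RELEASE, date(2025, 1, 1)))
```

The two-year default cutoff is arbitrary; the right threshold depends on the ecosystem (a stable C library ages differently than a fast-moving JavaScript package).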
Review of proprietary code
The next level of technical due diligence would involve other types of security scanning against the proprietary code of the 3rd party in question, i.e. products and services your organization uses but does not create. Doing so presents some challenges, even assuming you are hosting and running them yourself. Short of decompiling or otherwise reverse engineering a given application (which would likely represent a breach of your contract with the vendor), you would probably not be able to conduct static application security testing (SAST) on the source code itself. Running SAST against open-source code, while usually permissible, would likely result in a slew of findings that you will have difficulty interpreting. It is possible, however, to employ interactive (IAST) as well as dynamic application security testing (DAST) against such code at runtime. These tools will also likely return a swath of results, many of which will be false positives. With that said, malicious actors reconnoitering your 3rd party software will probably also be running similar tools to identify vulnerabilities, and it would behoove you to identify - and push for the swift remediation or mitigation of - such flaws before they do.
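To make the DAST idea concrete, consider one of the simplest runtime checks such tools perform: verifying that an application's HTTP responses carry common security headers. The sketch below is a toy triage helper, not a real scanner; the `EXPECTED_HEADERS` set is illustrative, and the input is a plain dictionary of captured response headers rather than a live request:

```python
# Headers whose absence DAST tools commonly flag; this list is
# illustrative, not exhaustive.
EXPECTED_HEADERS = {
    "Strict-Transport-Security",
    "Content-Security-Policy",
    "X-Content-Type-Options",
}

def missing_security_headers(response_headers):
    """Compare observed response headers (case-insensitively) against the expected set."""
    observed = {h.lower() for h in response_headers}
    return sorted(h for h in EXPECTED_HEADERS if h.lower() not in observed)

if __name__ == "__main__":
    # Hypothetical headers captured from a vendor-hosted application.
    captured = {"Content-Type": "text/html", "X-Content-Type-Options": "nosniff"}
    print(missing_security_headers(captured))
```

A missing header is exactly the kind of low-severity, possibly-false-positive finding mentioned above: it warrants a conversation with the vendor, not an automatic alarm.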
Penetration testing
Finally, the most intensive level of technical due diligence of a >3rd party would be penetration testing. These types of security evaluations have historically been incredibly expensive and difficult to scale effectively. Thankfully, a new series of automated or “as-a-Service” penetration testing products is now coming to market. Their consistent and repeatable nature also allows for a true “apples-to-apples” comparison between targeted networks. While using such tools and techniques against 3rd party organizations will likely be challenging from a business, compliance, and logistical standpoint (and likely impossible to do legally against 4th party networks), there is some precedent for doing so. Conducting such a test would offer the most comprehensive risk analysis possible, and may be appropriate in the case of a 3rd party handling especially sensitive data or business processes. Importantly, providing clear evidence of an exploitable vulnerability is the surest way to spur action on the part of a software publisher.
Conclusion
As I have written previously, you must conduct an iterative process when evaluating information security risk stemming from your organization's reliance on >3rd parties. Every investigation or analysis you undertake necessarily has an opportunity cost, and it is not feasible for most organizations to conduct an exhaustive review of every external party upon which their data relies. Relying on weaker indicators, such as security ratings, or even what I would consider to be mere heuristics (such as the ability of a vendor to provide a SOC 2 attestation) may be the best you can do in some cases. For especially critical business functions, however, using some of the technical due diligence techniques described above might be warranted. Weighing the relative costs and benefits will depend on your individual situation, but my hope is that this series of articles has provided you with some tools to make more informed risk decisions.