The Deploy Securely risk assessment model - version 0.3
A lightweight tool for evaluating security vulnerabilities.
I am happy to reveal version 0.3 of the Deploy Securely risk assessment model!
The purpose of the DSRAM (working title/acronym) is to allow information security professionals to evaluate the financial risk posed by individual vulnerabilities in a simple and potentially automated manner.
The model is built around the classic risk management formula:
Risk = Likelihood x Impact
In cybersecurity terms, this is equivalent to:
Annualized Loss Expectancy (ALE) = Annualized Rate of Occurrence (ARO) x Single Loss Expectancy (SLE)
The DSRAM specifies inputs for each of these factors, allowing you to arrive at a dollar value for the risk posed by a given vulnerability.
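The formula above can be sketched as a tiny function. The variable names here are mine, not the DSRAM calculator's, and the example values are hypothetical:

```python
def annualized_loss_expectancy(aro: float, sle: float) -> float:
    """Risk (ALE, $/year) = Likelihood (ARO, events/year) x Impact (SLE, $/event)."""
    return aro * sle

# Hypothetical vulnerability: exploitation expected 0.5 times per year,
# costing $200,000 per occurrence.
print(annualized_loss_expectancy(0.5, 200_000))  # 100000.0
```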
Some comments/caveats here:
This version is only applicable to published common vulnerabilities and exposures (CVEs), as available in the CVE list. While I plan to eventually develop a model that allows for evaluation of all vulnerabilities, I am starting with CVEs because the open-source exploit prediction scoring system (EPSS) provides an excellent and accessible way to predict the likelihood of a malicious actor exploiting a given CVE. My initial efforts at forecasting non-CVE exploitation ran into some serious headwinds (predicting the future is hard!), and the results are not yet ready for prime time. Once I have the kinks ironed out, I will provide this in a future version.
While data-driven, this is a heuristic model. It is built around historical data, but as I have mentioned before, ascribing a causal relationship to any single factor or combination thereof is not possible. As I have also mentioned, this isn’t especially important for the purposes of practitioners. Rapidly understanding your approximate risk exposure and moving quickly to manage it (accept, transfer, mitigate, avoid) is the name of the game here.
The DSRAM necessarily results in an “all-other-things-being-equal” output, meaning that it does not take into account unique situations or controls specific to individual organizations.
This isn’t a replacement for the Factor Analysis of Information Risk (FAIR) model, which is likely to provide you with a higher level of granularity and precision. But the DSRAM requires far less effort to implement and can be applied to individual vulnerabilities quickly and easily, which is not generally true for FAIR.
I want your feedback! As mentioned, this is version 0.3 (following some initial iterations on LinkedIn), and I expect many revisions in the future.
If you want to dive in and start testing the model, then feel free to run this Python notebook on Google Colab. But I would recommend reading below for an explanation of the various inputs. Since the exact values and fields are likely to change, this page will serve as a reference for general considerations, rather than diving into the details of specific calculations. Additionally, I may amend this post occasionally to ensure it is up to date.
Likelihood
The first question you should ask when evaluating the likelihood of exploitation of an issue is whether it is published on the common vulnerabilities and exposures (CVE) list. If the answer is “yes,” then you can just use the EPSS rating for the CVE and adjust it for an annual basis.
On that note, I want to be clear that I am currently using a very rough method for annualizing the EPSS score (which is provided by FIRST for the next 30 days, rather than the next 365). For details, check out the comments in the appropriate section of the notebook.
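One rough way to annualize a 30-day probability is to treat the year as a series of independent 30-day windows. This is an assumption of mine for illustration; the method actually used in the notebook may differ, so check the comments there:

```python
def annualize_epss(p_30_day: float) -> float:
    """Convert a 30-day exploitation probability to an annual one,
    assuming independent, identically distributed 30-day windows.
    This is only one rough approach, not necessarily the DSRAM's."""
    return 1 - (1 - p_30_day) ** (365 / 30)

# A 5% chance of exploitation in the next 30 days becomes roughly
# a 46% chance over a year under this assumption.
print(round(annualize_epss(0.05), 3))  # 0.464
```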
If the vulnerability is not a CVE, then unfortunately this version of the DSRAM is not applicable, and you’ll need to use some other method until I figure out how to evaluate non-CVEs.
Impact
The second part of the risk management formula requires you to estimate the likely impact of a cyber incident, in financial terms, using the SLE. I use the confidentiality, integrity, and availability (CIA) triad as the basis for my analysis here, calculating the SLE for each leg of the CIA triad.
Confidentiality
This is probably the most difficult attribute to estimate. The exposure of a single technical record with a company’s “secret sauce” could essentially put it out of business, while the loss of millions of records of another type - like the sequential integer primary keys for a database (but no other information) - would probably result in no loss, aside from potential reputational damage.
To guide your analysis, start with the question “what will be the monetary costs if this information gets into the wrong hands?”
To figure this out, an important place to start is defining “the wrong hands.”
Sometimes, information marked “proprietary” or “sensitive” is really neither of those things, and nothing bad would happen if you posted it on the internet.
Conversely, prematurely sending a contract proposal, with no markings on it, to a business partner with whom you have a non-disclosure agreement (NDA) could potentially ruin your negotiating position, costing you real dollars even if the information never becomes public.
Once you have determined who should see what information, think about the damage caused if someone outside that group gets access to it.
Are you subject to fines or penalties due to statutes or regulations like HIPAA or state data breach laws?
What is the reputational cost if you expose your customers’ private information?
How will your competitors take advantage of important technical data if it leaks?
Finally, put a price tag on it.
The IBM "Cost of a Data Breach" report is a good place to start for determining individual record values.
Additionally, you can look at historical regulatory actions for industries and situations similar to yours to determine what the penalties might be for losing control of various types of information.
Determining competitive impact can be challenging, but work with your business-line colleagues to determine what edge (if any) you have over the competition (in dollar terms like revenue), and what pieces of information would give them a leg up.
The DSRAM calculator has ported over some of these values from the IBM report to make calculations easier, but you can write in your own values, if desired.
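The per-record approach above can be sketched as follows. The $165 figure is an assumed placeholder (substitute the current value from the IBM report or your own estimate), and the flat regulatory and competitive terms are simplifications of mine, not the DSRAM calculator's exact fields:

```python
# Assumed per-record cost for illustration only; the IBM "Cost of a
# Data Breach" report publishes figures you can substitute here.
COST_PER_RECORD_USD = 165

def confidentiality_sle(records_exposed: int,
                        cost_per_record: float = COST_PER_RECORD_USD,
                        regulatory_penalty: float = 0.0,
                        competitive_loss: float = 0.0) -> float:
    """Single loss expectancy for a confidentiality breach:
    per-record costs plus flat regulatory and competitive losses."""
    return records_exposed * cost_per_record + regulatory_penalty + competitive_loss

# Hypothetical breach of 10,000 customer records plus a $50,000 fine.
print(confidentiality_sle(10_000, regulatory_penalty=50_000))  # 1700000.0
```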
Integrity
Evaluating data integrity impacts is slightly easier than doing so for confidentiality. Here are some of the things you should consider when doing so:
Is this the only copy of the data?
- If so, then focus your evaluation on what happens if you never get access to this information again.
- If not, look at the backup and restore costs, as well as the indirect impacts to your business operations.
Is the damage reversible?
- If so, then calculate how much it would cost to get back to a “good” state. Don’t forget to incorporate the opportunity costs of any data loss incurred during this rollback process.
- If not, consider whether you could recover anything of value from the corrupted information. If it’s encrypted with AES after a ransomware attack*, you are probably out of luck. But if one digit/letter is thrown out of place in the same way for each record, you can likely write a script to repair the data in short order.
* On the ransomware note, make sure you aren’t double-counting the costs to both integrity and availability. It’s a judgment call as to whether effectively irreversible encryption impacts data’s “integrity,” but frankly it’s an academic point. What matters is the resulting damage to your organization.
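To make the "one digit thrown out of place" case concrete, here is a toy repair script for a hypothetical corruption where every record's first character was moved to the end in the same way. The corruption pattern is invented purely for illustration:

```python
def repair_record(corrupted: str) -> str:
    """Toy repair for a hypothetical systematic corruption in which
    each record's first character was shifted to the end: move the
    last character back to the front."""
    return corrupted[-1] + corrupted[:-1]

# A record "12345" corrupted to "23451" is restored:
print(repair_record("23451"))  # 12345
```

The point is that when corruption is systematic rather than cryptographic, recovery can be a quick scripting exercise rather than a total loss.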
What are the second order effects and costs?
- How will your engineering and analytics teams be impacted when reacting to a data integrity incident?
- What are the reputational costs in the eyes of your customers and partners if your organization lets their data get messed up?
Availability
To evaluate the financial impacts of losing access to data due to a cybersecurity incident, think about the following:
What processes or operations rely on this service being available?
- If you are Amazon taking eCommerce orders through a web portal and make ~$13,995 in revenue each second, then having a few minutes of downtime would be massively expensive.
- If you are a marketing agency doing most of your work asynchronously, though, it’s probably not going to break your business if you are offline for a few hours (although it might, if you are up against a key deadline!).
Will I need to pay any direct costs as a result?
- Would you breach any service level agreements (SLAs) based on the resulting downtime? For how long? What are the penalties?
- Could customers churn, even if an SLA is not breached, due to the downstream impacts to their businesses?
Will there be headline risk resulting from serious downtime?
- Last summer, Akamai Technologies popped into the news as a result of triggering some outages. Most people probably didn’t know who Akamai was until their internet connections went down, so this event probably did not make a great first impression.
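The direct-cost portion of the questions above can be sketched as a simple calculation. Reputational and churn costs need separate estimates, so this only covers lost revenue and SLA penalties:

```python
def downtime_cost(revenue_per_second: float,
                  seconds_down: float,
                  sla_penalty: float = 0.0) -> float:
    """Direct cost of an outage: lost revenue plus any SLA penalties.
    Reputational damage and customer churn are not modeled here."""
    return revenue_per_second * seconds_down + sla_penalty

# Using the ~$13,995/second figure cited above, five minutes of
# downtime costs roughly $4.2 million in lost revenue alone.
print(downtime_cost(13_995, 5 * 60))  # 4198500.0
```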
In this day and age, people have built their lives and businesses around highly available services. Make sure you understand what happens if yours goes offline.
Conclusion
Once you have determined the likelihood (ARO) and impact (SLE) of an incident, you can multiply them together to get the annual cost of the relevant cyber risk (ALE). You can use this value for all sorts of things, such as during annual planning and budgeting or when comparing the costs and benefits of deploying engineering resources to tackle a security flaw. With the appropriate information in hand, you can make rational decisions that allow you to manage risk appropriately and optimize for whatever your organization’s desired outcomes are.
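Tying the pieces together, here is a minimal end-to-end sketch. All inputs are hypothetical, the annualization assumes independent 30-day windows, and the real DSRAM notebook drives these values from EPSS and the IBM report rather than hand-entered numbers:

```python
def dsram_style_ale(epss_30_day: float,
                    sle_confidentiality: float,
                    sle_integrity: float,
                    sle_availability: float) -> float:
    """End-to-end sketch: annualize a 30-day EPSS probability
    (assuming independent 30-day windows) and multiply by the summed
    CIA-triad single loss expectancies."""
    aro = 1 - (1 - epss_30_day) ** (365 / 30)
    sle = sle_confidentiality + sle_integrity + sle_availability
    return aro * sle

# Hypothetical CVE: 2% 30-day EPSS, $500k total CIA-triad impact,
# yielding an annualized loss expectancy of roughly $109k.
print(dsram_style_ale(0.02, 300_000, 100_000, 100_000))
```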
I hope that this model will be helpful to you, and at least partly delivers on the promise I made when I first launched Deploy Securely: to provide a way to communicate about cybersecurity risk in concrete terms that are comprehensible to the broader business community. I look forward to continuing to work on this project, with your help.