During an annual risk assessment, the cybersecurity governance committee needs to decide which of several identified risks should be mitigated first. Which metric would give the committee the best indication of each risk's probability so they can rank the risks in order of urgency?
Evaluating how often a given security incident could occur within a year.
Tracking the number of software updates and patches released per week.
Reviewing the historical time between successful incidents of specific natures.
Assessing the security features of the latest technologies implemented.
Evaluating how frequently a particular security event is expected to occur over a one-year period provides the annualized rate of occurrence (ARO). A higher ARO reflects a greater probability that the risk will be realized within the year, so risks with the highest ARO typically rise to the top of the mitigation list. By contrast, reviewing new technology features, patch cadence, or the historical time between past incidents may inform other aspects of the security program, but none of these directly expresses the likelihood of a future event, making them less effective for initial prioritization.
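As a minimal illustrative sketch (using entirely hypothetical risk names and figures), ARO can also be combined with single loss expectancy (SLE) to compute annualized loss expectancy (ALE = SLE × ARO), which is how a committee might move from ranking by probability to ranking by expected annual cost:

```python
# Hypothetical example: ranking risks by ARO (expected occurrences per year)
# and estimating annualized loss expectancy (ALE = SLE x ARO).

risks = [
    # (risk name, ARO: expected occurrences per year, SLE: loss per occurrence in USD)
    ("Phishing-led credential theft", 4.0, 15_000),
    ("Ransomware outbreak",           0.5, 250_000),
    ("Lost or stolen laptop",         2.0, 5_000),
]

# Sort by ARO, highest first, to prioritize the most probable risks.
for name, aro, sle in sorted(risks, key=lambda r: r[1], reverse=True):
    ale = sle * aro
    print(f"{name}: ARO={aro}, SLE=${sle:,}, ALE=${ale:,.0f}")
```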