Data-driven cyber: empowering government security with focused insights from data

How ‘small but actionable’ insights can improve behaviours and decision making.

In recent months, the NCSC has been accelerating its approach to data-driven cyber (DDC). Our goal is to encourage the adoption of an evidence-based approach to cyber security decisions, not only in how we advise external organisations, but also in how we address our own security.

We acknowledge that enterprise cyber security is becoming increasingly complex, and many teams are reluctant to introduce an additional ‘data layer’ due to concerns of becoming overwhelmed. In this blog post, we aim to demonstrate how concentrating on manageable, actionable insights can help teams embrace data-driven cyber security.

Our example showcases a collaboration between two teams within the NCSC:

  • the Vulnerability Reporting Service (VRS)
  • the Data Campaigns and Mission Analytics (DCMA) team

The VRS leads the NCSC’s response to reported vulnerabilities, while DCMA use their expertise in data science and analysis to provide the NCSC Government Team with evidence-based security insights.

Small actionable insights drive action

Many government teams, including the VRS, gather and manage vast amounts of valuable data. The challenge they face is how best to analyse it, given the misconception that developing any useful insights requires a complete overhaul of existing workflows.

This misconception stems from the idea that implementing DDC involves plugging all data into a complex ‘master formula’ to unveil hidden insights and narratives. However, it’s essential to recognise that, especially in the beginning, DDC should be viewed as a tool for generating ‘small yet actionable insights’ that can enhance decision-making. This simpler and more focused approach can yield significant benefits.

Vulnerability Avoidability Assessment

In the case of the VRS, we did exactly that: starting with the data sets that were available to the team, and then focusing on a single insight that could be used to have a meaningful, evidence-based security conversation.

To this end we created the Vulnerability Avoidability Assessment (VAA), an analytic that uses two internal data sources and one public source to determine what proportion of vulnerability reports resulted from out-of-date software. The data sources comprised:

  • number of vulnerability reports received by VRS
  • number of reports where out-of-date software was listed as a reason
  • public vulnerability disclosure database

We created this analytic knowing that patch management is one category of vulnerability that could be influenced, and that diving deeper into the link between patch management and the vulnerabilities reported through the VRS would provide us with a security discussion point about how vulnerabilities can potentially be avoided or reduced.
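At its core, the VAA metric described above is a simple ratio of two of those data sources. A minimal sketch (the class name, month labels, and report counts here are hypothetical, not real VRS figures):

```python
from dataclasses import dataclass


@dataclass
class MonthlyReports:
    """Hypothetical monthly tally of VRS vulnerability reports."""
    month: str
    total_reports: int
    out_of_date_reports: int  # reports citing out-of-date software


def avoidability_rate(m: MonthlyReports) -> float:
    """Proportion of the month's reports attributable to out-of-date software."""
    if m.total_reports == 0:
        return 0.0
    return m.out_of_date_reports / m.total_reports


jan = MonthlyReports("2022-01", total_reports=250, out_of_date_reports=4)
print(f"{avoidability_rate(jan):.1%}")  # 1.6%
```

Keeping the analytic this small is the point: one number per month is enough to start an evidence-based conversation.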

Our analysis

We gained a deeper insight into the impact of unpatched software on government systems by comparing the number of vulnerability reports resulting from outdated software with information from an open source database. This database provided estimates of how long these vulnerabilities had been publicly known, and when patches had become available.

Using the above approach we were able to define an ‘avoidable vulnerability’ as one that has been publicly known for a considerable time, to the extent that a responsible organisation would reasonably be expected to have taken the necessary actions to apply the required updates and patches.
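The post's definition of an 'avoidable vulnerability' is qualitative ("publicly known for a considerable time"), but it reduces to a simple date comparison once a concrete threshold is chosen. A sketch, assuming a 30-day threshold (the buffer used later in the analysis; the function name is ours):

```python
from datetime import date

# Assumed threshold: days a fix has been public before a responsible
# organisation would reasonably be expected to have applied it.
AVOIDABLE_AFTER_DAYS = 30


def is_avoidable(patch_released: date, report_date: date,
                 threshold_days: int = AVOIDABLE_AFTER_DAYS) -> bool:
    """A reported vulnerability counts as 'avoidable' if its fix had been
    public for longer than the threshold when the report was received."""
    return (report_date - patch_released).days > threshold_days


print(is_avoidable(date(2022, 1, 1), date(2022, 3, 15)))  # True
```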

Our analysis of data from 2022 (refer to Table 1, below) revealed that each month the VRS receives a considerable number of vulnerability reports directly linked to software that was no longer up to date, ranging from 1.6% to a peak of 30.7% of vulnerabilities in a single month over the course of the year.

TABLE 1. TOTAL NUMBER OF OUT-OF-DATE SOFTWARE REPORTS COMPARED TO THE TOTAL NUMBER OF VULNERABILITY (VULN) REPORTS RECEIVED FOR 2022.

We also investigated how long the software vulnerabilities went unpatched before they were reported. Referring to NCSC guidance, which recommends applying all released updates for critical or high-risk vulnerabilities within 14 days (NCSC Cyber Essentials guidance on ‘Security Update Management’, page 13), we chose a 30-day buffer as a consistent timeframe for applying patches, regardless of their severity. Grouping the timelines into 30-day increments, we found that 70% of outdated software vulnerabilities reported to the VRS were due to software remaining unpatched for more than 30 days (refer to Chart 1, below).
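The headline figure above is just the share of reports falling beyond the 30-day buffer. A sketch of that calculation (the sample day counts are illustrative, not real VRS data):

```python
def share_unpatched_over(days_public: list[int], threshold: int = 30) -> float:
    """Fraction of reported vulnerabilities whose fix had been public
    for more than `threshold` days at the time of the report."""
    over = sum(1 for d in days_public if d > threshold)
    return over / len(days_public)


# Illustrative day-counts for ten reports (7 of 10 exceed the buffer):
sample = [5, 12, 45, 90, 200, 31, 14, 400, 60, 365]
print(f"{share_unpatched_over(sample):.0%}")  # 70%
```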

CHART 1. THE LENGTH OF TIME A VULNERABILITY HAD BEEN IN THE PUBLIC DOMAIN.

This newfound understanding provided the VRS team with sufficient data to have an evidence-based discussion with stakeholders regarding their approach to patch management, supplying the insights needed to support a case for meaningfully reducing the number of vulnerability reports received by the VRS against government systems.

Conclusions

The journey towards DDC has highlighted the immense value of leveraging data to make evidence-based security decisions. The collaboration between the VRS and the DCMA team serves as a concrete example of how data can inform decision making. It is essential for organisations to recognise that adopting DDC does not require a complete overhaul of existing systems, but rather the ability to focus on extracting small but actionable insights that can drive behaviours and decisions.

Source: Data-driven cyber: empowering government security with… – NCSC.GOV.UK